Humility, humanity, and the hard questions: AI pathfinder Susan Etlinger on making meaning from data

In the latest episode of Data Malarkey – the podcast about using data, smarter – Master Data Storyteller, Sam Knowles, is joined by Susan Etlinger, one of the world’s most thoughtful voices on artificial intelligence, responsible technology, and data ethics. A veteran of Microsoft’s Azure AI division and former host of the Leading the Shift podcast, Susan’s 2014 TED at IBM talk, “What do we do with all this big data?”, remains as relevant today as it was more than a decade ago. Their conversation explores what’s changed – and what hasn’t – in how we make sense of an increasingly data-driven world.

“Data doesn’t create meaning – we do”

At the heart of Susan’s thinking is a simple but profound truth: data does not create meaning – people do. Machines can process exabytes of data at lightning speed, but without context, understanding, and human judgement, they risk amplifying bias or error at scale. “The more we automate,” Susan observes, “the higher the burden of proof that we’re automating things we should automate.”

This theme of intentionality runs throughout the discussion like a stick of Brighton rock. Whether in AI, analytics, or automation, Susan argues that we are at a critical inflection point – “intentionally ceding control to machines” – and that our choices now will define how technology shapes the future of work and society. For her, humility is key: the recognition that we do not and cannot know everything. Echoing the Socratic paradox (“All that I know is that I know nothing”), she believes humility and curiosity are the best safeguards against hubris in the age of AI.

Looking to the edges for evidence

Susan’s insights are grounded in both research and lived experience. Her moving story of her autistic son’s inventive use of Google to explore the world – typing “WIMEN” into a search bar aged three – demonstrates how unconventional forms of communication can reveal new ways of thinking. It’s a parable for the data-driven age: when we overvalue one kind of signal and overlook others, we risk missing meaning altogether.

Sam and Susan’s conversation also touches on bias, context, and inclusion. AI systems, Susan warns, learn from the data we feed them – and the internet reflects only a limited slice of human experience. “By virtue of the fact that the data training the underlying models is constrained, that will constrain the output,” she says. Encouragingly, she points to efforts in Africa and beyond to build models that reflect greater linguistic and cultural diversity.

Looking ahead, Susan’s prescription for navigating this era is refreshingly practical: study history, understand statistics, and learn how data and inferencing models actually work. Add to that a dose of sociology and anthropology – to keep sight of the human story behind the numbers – and perhaps we aren’t all going to hell in a handcart after all.

Summing up

Ultimately, Susan calls for a blend of critical thinking and compassion: the intellectual rigour to question what the data say, and the empathy to understand what they mean. As she puts it, technology is both “a tool and a weapon”. Whether it harms or helps depends entirely on the intentions and the integrity of those who use it.

The first draft of this blog was written by ChatGPT, based on the transcript of the podcast.
