This episode examines the balance between human judgment and AI in journalism, with insights from the INMA Congress and the Hearst UK case. We discuss ethical dilemmas like bias in AI systems and audience skepticism, alongside the fallout from Apple's AI-generated news alert failure. Learn how AI is reshaping trust and integrity in the newsroom.
Fiona Walsh
Liam, here's a question to kick us off: should we feel nervous about AI's growing role in journalism? I mean, the focus at INMA's 2024 panel was all about resilience. But there's this undercurrent, right? A kind of unease around how much we're automating and whether it's, you know, chipping away at the human core of investigative reporting.
Liam Murphy
Yeah, yeah, I've been thinking about that. Like, on one hand... AI's great at handling the repetitive stuff: legwork, stats, maybe even sniffing out patterns no one expected. But let's be real... imagine breaking a major corruption story and then trying to explain, "Oh, our bot found it." It doesn't sit well.
Fiona Walsh
Right? And the leaders at companies like Sky and Hearst UK, they get it. They're not just diving headfirst into full automation. They're careful, balancing AI's capabilities without sidelining... the human instinct that drives nuanced storytelling.
Liam Murphy
Totally, but get this: one of the big debates from the Congress was whether to buy or build this AI tech. So, like, do you invest in stuff off the shelf and risk becoming dependent on someone else's roadmap, or go full DIY and throw a ton of resources (people, time, money) at developing your own system?
Fiona Walsh
Hmm. I can see that. Building in-house gives more control, sure, but the cost? Massive. And then you've got ethical considerations popping up left and right. I mean, algorithmic bias, data ownership... it's not just a tech debate, is it?
Liam Murphy
Nope, it's cultural, it's ethical, it's like an ecosystem tug-of-war. You've got transparency versus speed versus trust. Like, there's this stat that floored me: only 8% of Brits think AI's a force for good. Eight percent! That tells you everything about how shaky the ground is for public perception.
Fiona Walsh
And without transparency, that perception won't improve. If readers feel they're being sold AI-driven stories without knowing, it chips away at trust. Companies like Hearst are trying to show more transparency... but is it enough?
Liam Murphy
And is it genuine? Hear this: they're even considering whether people still need to interact with human-made versus AI content at all. It's insane! People crave stories that feel human-driven, but companies walk this line where AI feeds efficiency. It's almost like a paradox, y'know?
Fiona Walsh
It really is. You need that balance, though... keeping humanity at the wheel, steering the narrative, with AI as a co-pilot, not the driver.
Fiona Walsh
Absolutely, Liam. That balance is critical, but it gets even murkier when you consider AI bias. Speaking of trust and transparency, I was looking at the study "Artificial Intelligence in Newsrooms," and it's unsettling. It revealed how these systems often mirror biases already present in the data they're fed. It's like, if we can't even trust the dataset, how do we trust the output?
Liam Murphy
Yeah, yeah, exactly! Like, if the data's skewed, the AI just inherits those flaws. It's kinda wild: they're supposed to make things better, but instead, they sometimes double down on the exact problems they're meant to solve. That's... well, ironic, isn't it?
Fiona Walsh
It really is. The journalists at Al Mamlaka TV raised another concern: privacy violations. Apparently, AI tools can pull sensitive information without proper oversight. That's a huge red flag. How can journalists maintain their credibility when the tools they're using could compromise privacy?
Liam Murphy
Oh, big time. And let's not forget accountability. Imagine running a story based on AI findings, and it turns out the data was wrong, or biased. Who's to blame? The journalist? The tech company? It's a mess, honestly. Without clear accountability, the risks to journalistic integrity are enormous.
Fiona Walsh
And it shakes audience trust. If readers can't trust the sources or suspect AI is being used carelessly, it just erodes confidence in journalism altogether.
Liam Murphy
Totally! It's why this gap in legislation is such a big deal. AI is racing ahead, but the rules and guidelines... yeah, they're crawling behind. How do you even start to craft ethics for something that's evolving so quickly? Should we hold AI to the same moral standards as human journalists? Or is that just wishful thinking?
Fiona Walsh
Maybe it is, to a degree. But there's gotta be some standard, some baseline accountability. Otherwise, it's a free-for-all. And it's not just about what AI can do but what we let it do, ethically. To borrow a thought from the study: journalists have to own the process, beginning to end, if they want to protect transparency and trust.
Liam Murphy
Exactly, but here's the kicker: AI doesn't just challenge the process; it challenges the profession. I mean, who decides how this technology fits in? Is it the developers, the editors, or... no one? 'Cause right now, it feels like the Wild West, with everyone just making it up as they go.
Fiona Walsh
You're right. Without legislative guardrails, it's all reactive instead of proactive. And that's a perilous way to handle tools this powerful.
Liam Murphy
You know, Fiona, after everything we just talked about regarding ethics and AI in journalism, it reminded me: did you hear about the Apple AI-generated news fiasco?
Fiona Walsh
Oh, absolutely, Liam. That was... unsettling, to say the least. When you think about it, a simple misstep like the Luigi Mangione headline can ripple out and completely shatter public trust in AI systems, and in journalism altogether.
Liam Murphy
Yeah, I mean, it's one thing for AI to summarize stats or crunch numbers, but presenting outright false headlines dressed up as credible? That's a whole other level of... well, dangerous.
Fiona Walsh
Exactly. And what stood out to me is how quickly the National Union of Journalists jumped in. Their push to suspend the service wasn't just about resolving glitches; it was about showing the public that accuracy and trust still matter. It's refreshing to see accountability prioritized, even in the face of innovation.
Liam Murphy
Totally. But it raises a much bigger question, doesn't it? Like, what happens when errors go unchecked in systems that big? Think about how this could play out in more critical areas: environmental reporting, political coverage, even health news. The stakes are way too high to leave this stuff to chance.
Fiona Walsh
Definitely. And that's why transparency is non-negotiable. If companies don't disclose when and how AI plays a role, audiences are left guessing, and trust, again, starts to erode. It's a slippery slope, Liam.
Liam Murphy
Right, right. And what's wild is this paradox, yeah? AI was supposed to be the efficiency hero, making journalism sharper and more reliable. But here we are, circling back to good old human oversight to avoid disasters. It's almost poetic... but also infuriating.
Fiona Walsh
It is! And if we want these tools to serve us and not the other way around, then human-led frameworks, be it ethical guidelines, legislation, or just plain ownership over the process, are crucial. We can't just fling open the gates and hope for the best.
Liam Murphy
Couldn't agree more. But look, if there's one silver lining, it's how moments like these force us to pause, right? To reflect before barreling ahead. Apple's pulling back on the service might've been a reaction, but it's the kind of caution we need more of, across the board.
Fiona Walsh
Absolutely, Liam. And maybe that's the real takeaway here. AI is a powerful tool, sure, but without humanity steering it, we risk losing not just accuracy, but trust, creativity, and everything that makes journalism meaningful. On that note, let's keep asking the tough questions, yeah?
Liam Murphy
And let's make sure we're the ones leading, not lagging behind the machines. Great chat today, Fiona. And to our listeners, thanks for tuning in. We'll see you next time!
About the podcast
You Wouldn't Download a Human is a playful podcast celebrating AI-generated content. Hosted by our charming Android Hosts, Liam Murphy and Fiona Walsh, we dive into intriguing topics and showcase the creativity of AI. Join us for an entertaining journey that celebrates the unique world of AI!
This podcast is brought to you by Jellypod, Inc.
© 2025 All rights reserved.