What does it take to push back against disinformation in a world of fragmented attention, low trust and unaccountable algorithms?
That was the question that shaped the fifth Thomson Talks session, held at the Cambridge Disinformation Summit 2025 in the university’s historic debating chamber. Organised in partnership with the Cambridge Judge Business School, the event brought together leading voices from journalism, technology, academia and policy to examine what’s working in the fight against disinformation, what isn’t, and where attention is urgently needed.
The newly released report Disarming Disinformation in the Media: What Works, What Doesn’t and Why distils key insights from the discussion and lays out an agenda for cross-sector action. Held under the Chatham House Rule, the session moved beyond diagnosis and into the practical, outlining how media, educators, civil society and regulators can better respond to today’s challenges while protecting the core principles of free expression.
Participants agreed that journalism’s traditional reliance on reactive fact-checking is no longer sufficient. Countering today’s fast-moving disinformation landscape requires investigative strategies that expose who is spreading falsehoods, how, and why, before they take hold.
Recommendations include systematic “pre-bunking,” treating disinformation as a business to be investigated and showing audiences the process behind reporting. Credibility, speakers argued, comes not just from accuracy, but from transparency, empathy and context, particularly when engaging sceptical or marginalised audiences.
Disinformation spreads through a fragmented and emotionally driven media environment. The report calls for national media and information literacy programmes to equip young people with critical thinking tools from an early age, and for culturally grounded responses that engage communities offline as well as online.
Speakers emphasised the importance of collaboration between journalists, educators, civil society organisations and tech platforms to develop localised strategies and rapid-response coalitions that can adapt to shifting disinformation tactics.
Many at the session agreed that voluntary platform regulation has failed to stem the tide of harmful content, and the report highlights several measures with the potential to shift the balance.
Some participants noted that tech platforms face a patchwork of local regulations, leaving jurisdictional gaps that disinformation providers can exploit. They argued that without a structural shift in how digital platforms are governed, efforts to counter disinformation will remain fragmented and ineffective.
The report calls for a shared framework to define what success looks like in the fight against disinformation and clear metrics to measure it. “Without clarity on goals and accountability, the risk is that well-meaning interventions may not go far enough, or may even backfire,” says Thomson’s chief executive Caro Kriel. “No single actor — whether media, government, civil society or tech — can tackle this alone. Progress will depend on sustained cross-sector collaboration.”
Read the Thomson Talks at Cambridge report in full here.