The Media Copilot

Hosted by journalist Pete Pachal, The Media Copilot is a weekly conversation with smart people on how AI is changing media, journalism, and the news.

  1. 1D AGO

    Search is Changing Fast. Is Your Brand Ready for the Answer Engine Era?

    Search is changing fast, and AI is at the center of it. Search is no longer just about blue links and ranking on Google. More and more, people are getting their answers directly from AI tools like ChatGPT, Google AI Overviews, and other answer engines that summarize information, pull citations, and decide what gets surfaced in real time. That means visibility is changing, and so is the value of content.

    In this episode of The Media Copilot, Pete Pachal speaks with Josh Blyskal, who leads answer engine optimization research at Profound, a company focused on tracking how brands appear inside AI-generated answers. Their conversation explores what answer engine optimization really means, how it differs from traditional SEO, and why specificity, utility, and structure now matter more than ever.

    What We Cover
    • The shift from traditional search results to AI-generated answers
    • What answer engine optimization (AEO) actually means
    • How AI tools break prompts into “fan-out” searches
    • Why specificity and structured content matter more than ever
    • The role of citations and consensus in AI responses
    • How platforms like ChatGPT and Google AI Overviews choose sources
    • Why Reddit and user-generated content still influence AI answers
    • The growing tension between AI discovery and publisher business models
    • Opportunities and risks for media organizations in the AI search era

    About the 👤 Guest
    Josh Blyskal
    • Website: https://www.joshblyskal.com
    • LinkedIn: https://www.linkedin.com/in/joshua-blyskal
    • X: https://twitter.com/joshblyskal
    • Speaker Deck: https://speakerdeck.com/joshbly

    Learn More About Profound
    • https://www.tryprofound.com

    About the show: To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the YouTube channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2026

    47 min
  2. MAR 5

    Building the Newsroom AI Playbook Without Turning Journalism into Slop

    AI is rapidly becoming part of how news is produced, distributed, and discovered. But what does that actually look like inside a newsroom? In this episode of The Media Copilot, host Pete Pachal speaks with Gina Chua, Executive Editor at Large at Semafor and Executive Director of the Tow-Knight Center for Journalism Futures at the CUNY Graduate School of Journalism.

    Chua shares how Semafor is experimenting with practical AI tools that support journalists in everyday workflows. These include tools for copy editing and proofreading, systems that suggest relevant datasets for charts while a reporter is writing, and tools that help surface related reporting across different outlets and languages. The conversation also explores how newsrooms can organize large volumes of information. At Semafor, interview transcripts from events and panels are integrated into internal systems so reporters can quickly search conversations, locate quotes, and review context directly. Chua emphasizes that these tools are designed to assist newsroom work rather than replace editorial judgment. She also offers a useful way to think about large language models: they are built to work with language, not to verify facts. When used carefully with known text sources, they can help summarize, organize, and analyze information.

    Beyond newsroom workflows, the discussion turns to the broader shift happening in how people access information. AI tools, chatbots, and automated summaries are increasingly becoming a gateway to news, which raises important questions about trust, verification, and the future role of journalism. This episode looks at how reporters, editors, and media organizations are adapting as AI becomes part of the information ecosystem.

    What we cover
    • How Semafor is experimenting with AI tools inside the newsroom
    • Using transcripts and Slack to search interviews and discussions
    • Why language models are useful for handling text but not verifying facts
    • The role of human review in newsroom publishing decisions
    • How AI interfaces are changing the way audiences find news

    TIMESTAMPS:
    00:00 – Intro: Journalism in the AI Era
    02:15 – Gina’s Background & Semafor’s Model
    06:00 – How Newsrooms Are Using AI Today
    10:00 – Trust in a Synthetic World
    14:00 – Transparency & Disclosure
    18:30 – AI Tools Inside Reporting
    23:00 – The Risk of Information Overload
    27:30 – Reinventing the News Business
    32:00 – Where AI Helps Most
    36:30 – The Future of Journalism

    About the 👤 Guest
    GINA CHUA
    LinkedIn 👉 https://www.linkedin.com/in/ginachua
    X (Twitter) 👉 https://x.com/GinaSKChua
    Instagram 👉 https://www.instagram.com/gina_chua_nyc
    Personal Website / Writing 👉 https://ginachua.me
    Author Page (Semafor) 👉 https://www.semafor.com/author/gina-chua

    About the show: To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2026

    51 min
  3. FEB 26

    She’s Building the AI Agent That Could Replace Your News Feed

    What if, instead of scrolling headlines, you had a personal intelligence agent that understood what matters to you and delivered only signal, not noise? In this episode of The Media Copilot, Pete Pachal talks with Eva Cicinyte, co-founder and CEO of Gnomi, an AI-powered real-time news agent designed to synthesize global information into actionable insight. The goal isn’t summaries or more feeds. It’s context.

    Eva explains how her experience in political data analytics shaped her mission to make high-quality understanding accessible to everyone, not just institutions with research teams. Gnomi pulls from global sources, social platforms, video, audio, and financial data to deliver personalized intelligence in real time. The platform’s new Finance Mode can even analyze live earnings calls as they happen, potentially surfacing market signals before headlines move prices.

    🔍 In this conversation
    • Why Gnomi is built as an “intelligence layer,” not a news app
    • How AI agents could replace search and traditional feeds
    • The danger of engagement-driven AI systems
    • Multilingual analysis and global perspective gaps
    • Using social and video data to detect emerging signals
    • Real-time market insights from live earnings calls
    • The future of journalism in an AI-first world
    • Ads, subscriptions, and the economics of AI tools

    If you care about the future of news, AI, finance, or how people will stay informed in the coming decade, this episode is a must-watch.

    00:00 – Intro: Why AI Agents Matter Now. Big-picture framing of the agent shift.
    02:10 – Eva’s Background & Building Gnomi. How she entered the agent space and what problem they’re solving.
    05:40 – What Actually Is an AI Agent? Clear distinction between chatbots and agents.
    09:15 – From Answers to Action. How agents move from generating text to executing workflows.
    13:50 – Designing Guardrails & Trust. Why autonomy requires control and reliability.
    18:20 – Real-World Use Cases. Where agents are already creating leverage.
    22:45 – AI in the Workflow Stack. Replacing apps and orchestrating tools.
    27:30 – Human + AI Collaboration. Why agents amplify people instead of replacing them.
    32:10 – Infrastructure: Memory, Context & Systems. What makes agents actually autonomous.
    37:00 – Competitive Advantage in the Agent Era. How companies should think about adoption.
    41:30 – The Future of the Agent Economy. Where this is all headed next.

    About the 👤 Guest
    Eva Cicinyte
    LinkedIn 👉 https://www.linkedin.com/in/eva-cicinyte-1447161b2
    Instagram (Personal) 👉 https://www.instagram.com/evapariscicinyte
    Official Website 👉 https://www.gnomi.com
    LinkedIn (Company Page) 👉 https://www.linkedin.com/company/gnomi
    Instagram (Company) 👉 https://www.instagram.com/gnomi.app

    About the show: To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the YouTube channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2026

    43 min
  4. FEB 19

    Fake News at Machine Speed: Inside AI’s Impact on Media Trust

    Poynter’s Alex Mahadevan explains how newsrooms can use AI without losing the fundamentals of verification, context, and accountability.

    AI is already embedded in how people discover and consume news, from search to chat interfaces to automated summaries. So the question is no longer whether journalism will be shaped by AI. It’s how newsrooms maintain trust while experimenting responsibly. In this episode of The Media Copilot podcast, Pete Pachal sits down with Alex Mahadevan, Director of MediaWise and a faculty member at Poynter, to unpack what media literacy looks like now that anyone can generate convincing content at scale. Alex shares how his background in data and local journalism shaped his approach to tools, why public-facing AI ethics policies matter, and what it will take for news organizations to bring audiences along for the next phase of the information ecosystem.

    Why this matters
    Trust is the core product. AI can either widen the trust gap with errors and low-quality content, or help rebuild credibility through transparency, better products, and clearer communication about how journalism is made. This conversation gets practical about what responsible AI use looks like, where disclosures help and where they can unintentionally slow innovation, and why the newsroom AI divide is becoming a real competitive advantage for organizations that adapt.

    What we cover
    • Alex’s journey into journalism and the global mission of MediaWise
    • How AI is reshaping misinformation, trust, and newsroom transparency
    • Practical uses of chatbots, coding agents, and AI workflows
    • The widening divide between AI enthusiasts and skeptics in newsrooms
    • Ethics, job concerns, and gray areas around AI-assisted writing
    • What the future of news may look like beyond traditional articles

    About the 👤 Guest
    🔗 Alex Mahadevan
    🔗 Poynter / MediaWise
    🔗 MediaWise

    About the show: To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the YouTube channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2026

    51 min
  5. FEB 5

    AI and Copyright: How Media Can Decide Between Litigation and Negotiation

    Lawsuits set public rules; contracts set private ones. A media attorney on how leverage, timing, and context decide the path.

    In this episode, Pete Pachal sits down with corporate and transactional attorney Jason Henderson, a streaming and licensing specialist who also happens to be a creative with real skin in the game. Jason breaks down why the popular “AI learns like humans” analogy only goes so far, how fair use really works in court, and why the future will be shaped less by courtroom theory and more by deal structures. The key parts of those deals are often overlooked: indemnification, and who actually bears the risk when things go sideways. From The New York Times and Perplexity headlines to the practical mechanics of licensing training data, this conversation gets grounded fast. Jason explains what matters most to media companies, what smaller publishers should watch, and why agentic browsing and attribution are shaping up to be the next pressure point.

    Why this matters: Media is facing a new kind of competition. Not always a stolen article, but a substituted experience. When AI tools summarize, synthesize, and answer in real time, the legal question is not only “Was it copied?” It is also “Does it replace the market for the original?” Jason outlines how courts evaluate that, why “transformative” is both the key term and the messiest one, and why the industry is drifting toward partnerships and licensing frameworks even as litigation continues. At the same time, the next wave is not just training bots or search bots. It is agents that behave like users and may be harder to block or even detect. The more AI becomes the interface to the web, the more urgent it becomes for publishers to understand the business and legal stakes.

    Key Takeaways
    • Fair use is not a blanket shield. Courts look at purpose, transformation, and market impact, and the facts matter.
    • Legitimate acquisition matters. Even if a use might be transformative, piracy can change the legal posture dramatically.
    • Media’s biggest fear is substitution. Summaries and AI answers can erode subscriptions, traffic, and trust, even without verbatim copying.
    • Deals are becoming more specific. Expect narrower permissions and more constraints on how data can be used for training or product features.
    • Risk is moving through contracts. Indemnification is common, but it is only as strong as the indemnifier’s balance sheet and insurance.
    • Attribution is the missing bridge. A clear “this came from” pathway could reduce conflict and rebuild value for original publishers.
    • Agentic browsing will raise the temperature. When AI acts as a user proxy, blocking and enforcement become harder, and the business questions get sharper.

    👤 Guest
    🔗 Jason Henderson
    🔗 Senior Attorney, JWL International
    🔗 Founder, Castle Bridge Media
    🔗 Co-host, Castle of Horror podcast (horror movie coverage)

    About the show: To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2026

    51 min
  6. JAN 29

    Teaching Journalists to Use AI Without Losing Critical Thinking

    A tech-forward journalism professor unpacks how AI is changing how he teaches reporting and what it means for the entry-level jobs that are increasingly endangered.

    AI is not just changing how journalism gets made. It is changing how journalism gets taught. In this episode of The Media Copilot, host Pete Pachal sits down with Kris Hodgson-Bright, professor of digital communications and media at Lethbridge Polytechnic in Alberta, Canada, to unpack what happens when AI enters the newsroom and the classroom at the same time. Kris has seen journalism education evolve from high-volume print production to an online-first, multi-platform workflow spanning campus news, radio, TV, and emerging formats. Now, he is putting AI directly into the curriculum, not as a shortcut for writing, but as a research assistant that can strengthen reporting, sharpen critical thinking, and help students confront one of the biggest challenges in modern media: bias and trust.

    Pete and Kris explore where AI fits in journalism training, where it doesn’t, and why transparent guardrails matter. They also dig into the job market reality for new journalists and communicators, plus the promise of immersive storytelling, including 360-degree video, VR, and photogrammetry, as a way to deepen understanding and empathy. Along the way, the conversation surfaces some of the most difficult questions facing the media right now: how much automation is too much, where responsibility still sits with the human journalist, and how educators can prepare students for an industry that is evolving faster than any syllabus. This is a grounded conversation about the future of media work: hopeful about what AI can enhance, and clear-eyed about the slippery slope toward low-quality content and atrophied thinking.

    Why this matters
    As AI becomes embedded in every part of media, the next generation of journalists and communicators will be judged on more than writing skills. They will be judged on judgment: bias awareness, ethical decision-making, transparency, and the ability to use tools without surrendering the work of thinking.

    What we cover
    • How journalism education shifted from print-heavy production to online-first publishing
    • The right way to integrate AI into student workflows without outsourcing the writing
    • Using AI to check for bias and improve historical context in local reporting
    • What transparency and disclosure should look like in AI-assisted media
    • Media law, ethics, privacy, and how to teach responsible AI use
    • Why the journalism job market is harder and what students can do to stand out
    • Immersive journalism, empathy, and what VR still gets right even without mass adoption
    • Kris’s hopes and fears about AI’s long-term impact on media

    👤 Guest
    🔗 Kris Hodgson-Bright | Lethbridge Polytechnic
    🔗 Kris Hodgson-Bright (@hodgsonkr) / Posts / X
    🔗 krishodgsonbright / LinkedIn

    To explore more conversations like this and see what’s new, visit the freshly updated Media Copilot website at mediacopilot.ai. You’ll find new episodes, expanded resources, and tools designed for journalists, communicators, and media leaders navigating the fast-changing world of AI. It’s the home base for everything Media Copilot, and it’s just getting started.

    Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    All rights reserved. © AnyWho Media 2025

    40 min
  7. JAN 16

    Why Yahoo Still Matters and What It Knows About the Future of News

    A candid look at how aggregation, personalization, and trust shape news discovery in an AI-driven internet.

    Yahoo has been part of the internet’s front door for more than two decades. But what does it mean to guide audiences through news today, when consumption is fragmented, trust is fragile, and AI is reshaping how information is found, summarized, and shared? In this Season 4 conversation of The Media Copilot, Pete Pachal sits down with Kat Downs Mulder, GM of Yahoo News, to unpack how one of the largest digital media platforms in the world is rethinking aggregation, personalization, and user habits in the age of AI. From audio-first experiences and AI-powered summaries to the integration of Artifact’s technology into the Yahoo News app, Mulder explains how Yahoo is balancing innovation with responsibility while supporting original journalism across a noisy, algorithm-driven ecosystem.

    Why This Matters
    AI is no longer just a back-end optimization tool. It is actively shaping how audiences encounter news, how trust is maintained, and how publishers survive. This episode offers a rare inside look at how a major aggregator is navigating those shifts thoughtfully, without racing ahead of the facts or sacrificing credibility. For media leaders, journalists, creators, and product teams, this conversation surfaces real-world lessons about where AI adds value, where human judgment remains essential, and why aggregation still plays a critical role in a healthy information ecosystem.

    What We Cover in This Episode 👇
    • Why Yahoo still matters as a major gateway to news
    • How AI is reshaping content aggregation and personalization
    • Why audio is becoming a powerful habit-building news format
    • What Yahoo learned from integrating Artifact into its app
    • How AI summaries drive deeper engagement rather than replace it
    • Balancing speed, scale, and trust in AI-driven news products
    • How publishers and creators coexist inside Yahoo’s ecosystem
    • Why user behavior matters more than age or demographics
    • What an agent-driven web means for the future of news discovery

    👤 Guest
    Kat Downs Mulder
    General Manager, Yahoo News: https://news.yahoo.com/
    🔗 LinkedIn: https://www.linkedin.com/in/katdowns
    🔗 X (formerly Twitter): https://x.com/katdowns
    🔗 Yahoo Press Announcement: https://www.yahooinc.com/press/yahoo-appoints-kat-downs-mulder-as-svp-amp-general-manager-of-yahoo-news
    🔗 Speaker Bio (Digital Content Next Summit): https://events.digitalcontentnext.org/next-summit-2023/speaker/636864/kat-downs-mulder

    📩 Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    © 2025 Musso Media. All rights reserved. © AnyWho Media 2025

    41 min
  8. DEC 28, 2025

    Best of the Year: Inside the AI Shift that’s Transforming Media and Journalism

    As the year wraps up, we’re taking a pause from weekly interviews to share a curated Best of the Year in AI. This special episode of The Media Copilot is a look back at the conversations that defined the past year: the questions, tensions, and turning points shaping how media, journalism, and technology intersect right now.

    Over the past year, Pete has spoken with some of the sharpest minds working at the center of AI, publishing, and platform design. And while the tools keep evolving, the same core questions kept resurfacing: How should creators and publishers be compensated in an AI-driven world? Where does transparency end and exploitation begin? Who actually controls the future of information, and who should?

    In this Best Of episode, you’ll hear standout moments from those conversations, including:
    • How publishers are navigating AI licensing, attribution, and revenue
    • Why the rise of AI agents and scraping tools is forcing a rethink of digital rights
    • The growing tension between innovation and consent
    • What ethical AI actually looks like in practice, not theory
    • Why human judgment, context, and trust still matter more than ever

    From conversations with leaders at ProRata, Cloudflare, Taboola, Factiva, and more, this episode captures the real debates happening behind the scenes, beyond the headlines and hype.

    🎙️ Featured Voices
    Bill Gross – Founder & CEO, ProRata
    Annelies Jansen – Chief Business Officer, ProRata
    Mark Howard – Chief Operating Officer, Time (formerly Time Inc.)
    Adam Singolda – CEO, Taboola
    Toshit Panigrahi – CEO & Co-Founder, Tolbit
    Aurélie Guerrieri – Chief Marketing & Alliances Officer, Cloudflare
    Stephanie Cohen – Chief Strategy Officer, Cloudflare
    Mark Riley – Founder & CEO, Mathison AI
    Traci Mabrey – General Manager, Factiva
    Trip Adler – Founder & CEO, Created by Humans

    If you care about the future of media, the economics of creativity, or how AI is reshaping who gets paid and who gets left behind, this one’s for you.

    🎧 Listen now to The Media Copilot: Best of 2025, and stay tuned for what’s next.

    📩 Enjoyed this episode? Subscribe to The Media Copilot on Substack, Apple Podcasts, Spotify, or your favorite app. On YouTube? Tap the Like button and Subscribe to the channel. For more AI tools and resources built for media professionals, visit MediaCopilot.ai.

    Produced by Pete Pachal and Executive Producer Michele Musso
    Edited by the Musso Media Team
    Music: “Favorite” by Alexander Nakarada, licensed under CC BY 4.0
    © 2025 Musso Media. All rights reserved. © AnyWho Media 2025

    31 min

Ratings & Reviews

5
out of 5
5 Ratings

About

Hosted by journalist Pete Pachal, The Media Copilot is a weekly conversation with smart people on how AI is changing media, journalism, and the news.

You Might Also Like