The AI Download

Shira Lazar

AI News with a Human Touch

AI is transforming how we work, create, and connect—but what does that mean for the future of content, business, and creativity? Hosted by Emmy-nominated digital pioneer Shira Lazar, The AI Download is your go-to source for sharp insights, expert interviews, and real-world discussions on AI, emerging tech, and the creator economy.

Each week, we break down the biggest AI trends, explore their impact on creativity and business, and bring you candid conversations with industry leaders, innovators, and disruptors shaping the future. Whether you're a creator, entrepreneur, or just AI-curious, this podcast deciphers what’s next—and why it matters.

🔹 New episodes every week
🔹 Featuring top experts & thought leaders
🔹 Where AI meets culture, creativity, and business

Stay ahead of the curve!

The AI Download is hosted, created, and executive produced by Shira Lazar. Executive production by Michele Musso, with video and audio editing by the Musso Media team. Creative Design Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Musso Media. © 2025 Unicorn Enterprises, Inc. All rights reserved.

  1. Work Slop, AI Parasites & Why Your Creative Instinct Still Matters (with Nate Jones)

    1D AGO

    Work Slop, AI Parasites & Why Your Creative Instinct Still Matters (with Nate Jones)

    AI isn’t replacing your job—it’s flooding it with noise. Shira Lazar and Nate Jones unpack how to protect your creativity in the AI era.

    This week on The AI Download, host Shira Lazar welcomes back creator and “AI explain-it-all” king Nate Jones for a high-speed, high-depth dive into AI’s cultural side effects: from “work slop” and productivity theater to chatbot obsessions and TikTok gurus poisoning your prompts. They unpack the rising tide of AI-generated noise—messy docs, shallow decks, pseudo-polished proposals—and why the real threat isn’t automation but creative laziness. They explore how tools like Claude and Perplexity are reshaping research and communication, and why taste, intent, and self-trust may be the ultimate AI-proof skills. Plus: Spotify’s crackdown on low-quality AI tracks, Nvidia’s $100B chip bet on OpenAI, and why Nate’s inbox has become a surreal fever dream of humans confessing their love to LLMs.

    Why This Matters: Everyone’s using AI—but not everyone’s using it well. As generative tools go mainstream, we risk drowning in “good enough” content that’s efficient but empty. This episode is a call to reclaim intent, sharpen instincts, and build work worth something. Whether you’re a founder, strategist, or solo creator, this is your roadmap to staying human in an automated world.

    What We Cover:
    “Work Slop” Defined – How AI-generated fluff clogs workflows and kills clarity
    Prompt Like You Mean It – Why vague intent = bad output
    Beware the Parasites – Culty AI talk tracks spreading like memes
    Spotify vs. the Slop – Why platforms are purging millions of AI tracks
    Nvidia x OpenAI – Inside the $100B chip deal shaping AI’s future
    Defend Your Flow – Protecting creativity from automation fatigue
    Delete the Dashes – The weird quirks of AI and keeping your voice
    Claude FTW – How Claude 4.1 is revolutionizing Excel & slides
    Gut Check Nation – Why instinct is your ultimate AI-proof skill
    Aim With Purpose – What “responsible AI” really means

    Key Takeaways:
    Don’t just automate—edit. AI is your intern, not your identity.
    “Slop” happens when you stop caring—set quality gates.
    Taste and trust are your competitive edge.
    Not every Reddit prompt deserves your time—think before you paste.
    Polished ≠ good. Your voice, vibe, and vision remain irreplaceable.

    🎙 Guest: Nate Jones. https://www.natebjones.com/
    Creator of the “AI For Work” Substack: https://natesnewsletter.substack.com/
    TikTok/YouTube: @NateJones
    Known for making AI make sense (and calling out the BS)

    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.

    29 min
  2. Build Your Own GPT: The Playbook to Scale Creativity—Without Losing Your Voice

    SEP 25

    Build Your Own GPT: The Playbook to Scale Creativity—Without Losing Your Voice

    This week on The AI Download, host Shira Lazar and Jim Marsh pull back the curtain on the SAIL Framework—a step-by-step system to build your own custom GPT. Learn how Shira designed a GPT for What’s Trending, and how you can create one that scales your creativity without losing your voice.

    Jim Marsh is a former HBO executive and founder of JMC Strategic Intelligence, who helps leaders design practical AI systems that actually enhance creativity, strategy, and content. Together, they demo the framework, break down how it works day-to-day, and share a free worksheet so you can start building your own GPT.

    Grab your free SAIL Worksheet here → shiralazar.com/SAIL

    📌 Connect with Jim Marsh
    🌐 Website: jmcstrategic.com
    🔗 LinkedIn: linkedin.com/in/jimmarsh
    🐦 Twitter/X: @JimMarshAI

    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.

    46 min
  3. “Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI

    SEP 18

    “Every AI Model Failed”: Sean Dadashi on Mental Health, Suicide Risk & the Future of Safe AI

    Can AI replace therapy—or is it putting our mental health at risk?

    This week on The AI Download, we’re diving into one of the most sensitive frontiers in technology: AI and mental health. Shira Lazar is joined by Sean Dadashi, the co-founder of Rosebud, an interactive AI-powered journaling app designed to help you build self-awareness, track emotional patterns, and become your best self. But this isn’t just another feel-good AI tool—Rosebud is setting a new standard for ethics, safety, and intention in the wellness-tech space. What started as a personal passion project born from Sean’s own healing journey is now a fast-growing platform with serious backing, including from Reddit co-founder Alexis Ohanian, and a user base seeking a better way to integrate tech into their mental health routines.

    Shira and Sean get into the deep stuff:
    → How Rosebud’s memory model works differently than ChatGPT or Claude
    → Why AI can’t (and shouldn’t) replace therapy but can be a helpful coach
    → What happens when someone journals about self-harm?
    → And what every major AI model got wrong when tested for crisis response.

    Yes, every model failed. That’s why Sean and his team built The CARE Benchmark, a new, open-source framework to test how AI models respond to suicide risk, psychosis, and isolation. Spoiler alert: even the best models today still get it dangerously wrong.

    They also talk privacy vs. intervention, addiction vs. support, and how Rosebud is intentionally limiting user engagement (even if it costs them revenue) in the name of long-term well-being. This is one of the most real conversations we’ve had yet about the human cost—and potential—of AI in our most vulnerable moments.

    What You’ll Learn:
    • Why “pattern recognition” is Rosebud’s superpower
    • The dark side of agreeable AI: psychosis, bias, and feedback loops
    • What happened in the case of Adam Reign—and how it changed everything
    • How Sean is pushing for a third-party standard to test AI model safety
    • The difference between coaching, therapy, and ethical AI boundaries
    • What makes Rosebud different—from memory to usage caps to bedtime check-ins
    • What’s coming next: biofeedback integrations, CARE 2.0, and year-end Wrapped reports

    About the Guest: Sean Dadashi is the CEO and co-founder of Rosebud, a purpose-built AI journaling app focused on helping users develop self-awareness and emotional clarity. His background spans cognitive science, tech innovation, and a lifelong passion for meditation, psychology, and human potential. Prior to Rosebud, Sean worked across product development and personal growth spaces and now leads the company’s mission to responsibly integrate AI with mental wellness.

    Follow & Subscribe
    📲 Download Rosebud: https://www.rosebud.ai
    💬 Connect with Sean on LinkedIn: https://www.linkedin.com/in/seandadashi
    🧠 Learn more about the CARE Benchmark: https://www.rosebud.app/care
    🎧 Subscribe for more conversations at the intersection of AI, ethics, and humanity.

    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.

    43 min
  4. AI, Sex & the “Technosexual” Era: Kaamna Bhojwani on Love, Shame, and Robot Romance

    SEP 12

    AI, Sex & the “Technosexual” Era: Kaamna Bhojwani on Love, Shame, and Robot Romance

    AI boyfriends, VR bedrooms, and relationship “training wheels”... how AI is reshaping intimacy, why shame still runs the show, and how to use tech as a tool (not a trap).

    Dating apps felt disruptive—AI companions are a whole new planet. In this candid conversation, Kaamna Bhojwani, sexologist, Psychology Today columnist, and host of Sex, Tech & Spirituality, explains what it means to be “technosexual,” why some people say an AI boyfriend is “better than nothing,” and where things tip into addiction, isolation, and bad incentives. We get practical too: using AI for sex education, role-playing tough conversations, closing the orgasm gap, and building the three C’s of lasting connection: communication, creativity, and—yes—cunnilingus. We also hit data privacy, attachment styles, teledildonics, VR, and the coming wave of humanoid robots. Tool, not replacement—that’s the mantra.

    What we cover:
    Technosexuality 101: integrating AI into personal and interpersonal life without losing the plot
    AI companions: who benefits, where harm starts, and why “better than nothing” is the current bar
    Shame, culture, and intimacy: how stigma shapes behavior—and how to unlearn it
    Education > Porn: using AI for safer, smarter sex ed with bias checks
    Tools & toys: VR fantasy spaces, teledildonics, and bio/AI devices (e.g., Lioness) that help you learn your body
    Communication boosters: rehearsal with AI role-play, prompts, then put the phone down and connect IRL
    Data privacy: what these apps collect—and why you should actually read the T&Cs
    Spirituality & self-knowledge: guarding the human parts algorithms can’t replace

    Guest
    Kaamna Bhojwani — Sexologist & author; Psychology Today columnist (Becoming Technosexual); host, Sex, Tech & Spirituality
    IG: @kaamnalive • LinkedIn: Kaamna Bhojwani
    https://www.kaamnalive.com/

    🎧 Love the episode? Follow, rate & review The AI Download on Apple Podcasts, Spotify, or wherever you listen.
    📺 Watching on YouTube? Like, subscribe, and hit the bell for updates.
    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.

    44 min
  5. Are We Raising AI Like Our Kids? With De Kai

    AUG 29

    Are We Raising AI Like Our Kids? With De Kai

    If AIs are our artificial children, what kind of parents are we?

    In this mind-expanding episode, Shira Lazar sits down with De Kai — cognitive scientist, musician, and one of the original architects behind Google Translate — to explore how today’s algorithms are shaping culture, emotion, and power... and how we might raise better machines before they raise us.

    As governments rush to regulate and tech giants push toward increasingly autonomous systems, De Kai warns that the real crisis isn’t code—it’s character. With AI agents already influencing elections, mental health, and global conflict, he believes the stakes aren’t just technological, but moral. Can we teach empathy, curiosity, and cultural nuance to machines? Or are we too late?

    This episode unpacks:
    🔹 Why De Kai calls AIs our “artificial children” — and what that metaphor reveals
    🔹 How AI’s emotional immaturity mirrors our worst online behaviors
    🔹 The hidden influence of algorithmic censorship and unconscious filtering
    🔹 Whether AIs should ever serve as therapists — and the risks of emotional dependence
    🔹 What it will take to build AIs that are mindful, not just predictive
    🔹 Why future AIs will raise the next generation of AIs — and why this generation is our last chance to get it right
    🔹 How music, cognitive science, and ethics inform De Kai’s AI philosophy
    🔹 Why energy use and “big dumb” transformer models could soon be replaced by leaner, smarter AI

    Whether you’re building tech or just living alongside it, this episode offers a powerful reframing of our role in the AI era.

    🎙 Guest: De Kai: https://dek.ai/
    📖 Author of Raising AI
    🌐 dek.ai

    🎧 Love the episode? Follow, rate & review The AI Download on Apple Podcasts, Spotify, or wherever you listen.
    📺 Watching on YouTube? Like, subscribe, and hit the bell for updates.
    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    👉 HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Shira Lazar. All rights reserved.

    30 min
  6. From Hype to Reality: Investing in AI with Jeremiah Owyang

    AUG 21

    From Hype to Reality: Investing in AI with Jeremiah Owyang

    Billions are pouring into AI—but are the returns real? The hidden dynamics of investment, adoption, and ethics reveal how startups, creators, and enterprises can prepare for what’s next.

    This episode is brought to you by AI Venture Lab, powered by INSEAD — where intelligence takes shape.

    AI is democratizing technology, but at what cost—and for whose benefit? Shira Lazar and Jeremiah Owyang tackle the core questions at the heart of today’s AI ecosystem: the speculation around AI investments, the survival strategies of startups, and the pressure on talent as companies race toward AGI. They unpack how business models, community, and ethics will determine who thrives, who fades, and how society adapts.

    This episode examines:
    🔹 Why AI investments are booming despite uncertain payoffs
    🔹 The rise of “lean AI startups” generating millions with minimal headcount
    🔹 The fierce competition (and sky-high compensation) for top AI talent
    🔹 The strategic importance of community and network effects for startups
    🔹 Emerging ethical and regulatory debates shaping AI’s trajectory
    🔹 How AI agents are redefining customer service and the future of work
    🔹 Lessons from recent AI missteps—and what founders must learn fast

    Whether you’re an investor, founder, creator, or simply curious about how AI will shape your career and community, this episode offers both clarity and candor on the road ahead.

    🎙 Guest: Jeremiah Owyang https://web-strategist.com/blog/
    📌 Venture Capitalist, Technologist & Community Builder
    🌐 https://www.linkedin.com/in/jowyang
    Check out Llama Lounge and attend an upcoming event in SF: https://lu.ma/llamalounge

    🎧 Love the episode? Follow, rate & review The AI Download on Apple Podcasts, Spotify, or wherever you listen.
    📺 Watching on YouTube? Like, subscribe, and hit the bell for updates.
    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    SPONSORS:
    AI Venture Lab Powered by INSEAD
    The AI Venture Lab is redefining how startups are built in the age of AI. Their newly launched AI Founder Sprint 2025 has already drawn over 1,000 applications from founders across 90+ countries. What makes the sprint different? It’s equity-free, cost-free, and fully global — offering access to world-class mentorship, research-backed frameworks, and opportunities to showcase at Global Demo Days in Singapore and Abu Dhabi. However, this is just the beginning: the future of venture building is unfolding now. Follow the AI Venture Lab on LinkedIn and Instagram to see how founders worldwide are rewriting the rules with AI, and tap into the frameworks, insights, and stories shaping it.

    Also 👉 Special Thanks to Sponsors: HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Musso Media. © 2025 Shira Lazar. All rights reserved.

    32 min
  7. How To ACTUALLY Use AI to Grow Your Business

    AUG 14

    How To ACTUALLY Use AI to Grow Your Business

    Why the next wave of AI will reward precision and purpose—and how to maintain control as the tools grow more capable.

    👉 Special Thanks to Our Sponsors: HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    On this episode of The AI Download, Shira Lazar speaks with Shelly Palmer—futurist, author, and CEO of The Palmer Group—about the realities and opportunities in today’s AI landscape. With OpenAI’s GPT-5 now available to millions and AI agents emerging as a transformative force, Palmer provides a clear, disciplined perspective on how to approach these tools strategically.

    Palmer emphasizes starting with a defined objective—whether in business, creative work, or research—before selecting and applying AI solutions. He examines the operational and security considerations of deploying agents, the structural changes from search engines to “answer engines,” and the importance of governance in preventing unintended access or misuse.

    This episode examines:
    🔹 What sets GPT-5 apart in performance, reasoning, and safety measures
    🔹 How to choose the right model for specific writing, coding, or reasoning needs
    🔹 Governance frameworks and permissions to protect sensitive data
    🔹 The economic and strategic impact of shifting from search to direct answers
    🔹 Effective approaches for individuals and teams to experiment with AI safely
    🔹 Persistent misconceptions about AI capabilities and limitations in 2025
    🔹 Why defined goals and measurable outcomes matter more than technical novelty

    Whether you’re leading an enterprise, scaling a startup, or exploring AI’s potential in your own work, this conversation delivers a pragmatic framework for using these technologies with intent, accuracy, and control.

    🎙 Guest: Shelly Palmer
    📚 Author, Speaker & CEO of The Palmer Group
    LinkedIn • Newsletter
    📲 Follow: @ShellyPalmer on X, LinkedIn, and more
    🌐 Learn more: https://www.shellypalmer.com

    🎧 Love the episode? Follow, rate & review The AI Download on Apple Podcasts, Spotify, or wherever you listen.
    📺 Watching on YouTube? Like, subscribe, and hit the bell for updates.
    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    Sponsored by:
    Thanks to our sponsor, Hivelighter—the AI-powered tool that supercharges your reading and research with personalized insights. Want to dive deeper? Check out The AI Download Collection, where we’ve curated the top highlights from our biggest AI stories. Just click this Hivelighter link to explore more.

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso. Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Musso Media. © 2025 Shira Lazar. All rights reserved.

    50 min
  8. AI Bias, Community Power & Raising Critical Thinkers with Dr. Avriel Epps

    JUL 31

    AI Bias, Community Power & Raising Critical Thinkers with Dr. Avriel Epps

    Researcher and advocate Dr. Avriel Epps joins Shira Lazar to unpack the deep-rooted biases embedded in AI—and why our social systems are the real algorithm. From selfie filters that whitewash to surveillance tools that target, Dr. Epps breaks down how technology reinforces existing injustice, and what we can do to build tools that heal instead of harm.

    👉 Special Thanks to Our Sponsors: HiveLighter – Your AI reading assistant for smart, personalized summaries. Explore our curated AI Download Collection: https://www.hivelighter.ai

    On this episode of The AI Download, Shira Lazar sits down with Dr. Avriel Epps—data scientist, developmental psychologist, and founder of AI for Abolition—for a powerful conversation about how bias seeps into machine learning systems and what we must do to resist, reshape, and rebuild the future of tech. Dr. Epps doesn’t just expose the harm—she offers a framework for healing. From her kids’ book on AI bias to tools for radical community engagement, she shows how literacy, organizing, and emotional self-awareness are the real defense against a fractured AI future.

    This episode dives into:
    🔹 Why algorithmic bias is not just technical—it's historical
    🔹 How social media and recommendation engines amplify inequality
    🔹 Why “fixing the data” isn’t enough—we need to fix the incentives
    🔹 The hidden harms of selfie filters, AI influencers, and convenience culture
    🔹 Why community organizers, technologists, and social theorists must co-create
    🔹 Her vision for “data labor unions” and rethinking digital compensation
    🔹 What we all can do—parents, educators, kids—to reclaim power in a tech-driven world

    Whether you’re building AI, raising kids, or just trying to stay conscious in the age of the algorithm, this episode is a masterclass in awareness, equity, and how to lead with heart in tech.

    🎙 Guest: Dr. Avriel Epps
    📚 Author of A Kid’s Book About AI Bias
    🎓 Founder of AI for Abolition
    📲 Follow: @KingAvriel on TikTok, IG, and more
    🌐 Learn more: https://www.akidsco.com/products/a-kids-book-about-ai-bias

    🎧 Love the episode? Follow, rate & review The AI Download on Apple Podcasts, Spotify, or wherever you listen.
    📺 Watching on YouTube? Like, subscribe, and hit the bell for updates.
    📩 Stay in the loop: Subscribe to Shira’s newsletter → Shira’s Newsletter on Beehiiv https://shiras-newsletter.beehiiv.com

    Follow Shira: x.com/shiralazar instagram.com/shiralazar tiktok.com/@shiralazar linkedin.com/in/shiralazar youtube.com/shiralazar

    🎬 Visit mussomedia.com for storytelling that connects.

    Sponsored by:
    Thanks to our sponsor, Hivelighter—the AI-powered tool that supercharges your reading and research with personalized insights. Want to dive deeper? Check out The AI Download Collection, where we’ve curated the top highlights from our biggest AI stories. Just click this Hivelighter link to explore more.

    Credits
    This episode of The AI Download was hosted, created, and executive produced by Shira Lazar. Executive Producer Michele Musso, with video and audio editing by the Musso Media team. Creative Director Nadia Giosia with Mint Labs. Music by PALA, Catalina Coastline (licensed under Boss Soundstripe Productions by BMI). Produced by Musso Media. © 2025 Musso Media. © 2025 Shira Lazar. All rights reserved.

    43 min


4.5 out of 5 (8 Ratings)

