The Rip Current with Jacob Ward

Jacob Ward

The Rip Current covers the big, invisible forces carrying us out to sea, from tech to politics to greed to beauty to culture to human weirdness. The currents are strong, but with a little practice we can learn to spot them from the beach, and get across them safely. Veteran journalist Jacob Ward has covered technology, science and business for NBC News, CNN, PBS, and Al Jazeera. He's written for The New Yorker, The New York Times Magazine, Wired, and is the former Editor in Chief of Popular Science magazine.

  1. 05/03

    Lethal Beta: Gaza Was the Test Run. Iran Is the Launch.

    For decades, Israel has deployed new weapons systems in Gaza, collected data on what works, refined them, and sold the results internationally as “battle-tested” technology. That pipeline is now running on AI. A system called Lavender assigned kill ratings to 37,000 Palestinians. Operators approved strikes in around 20 seconds. The accepted error rate was 10 percent. Before AI targeting, Israeli analysts produced around 50 verified targets per year. After: up to 250 strikes per day. Gaza was the beta test. This week’s strikes on Iran are the product launch. In this video, I’m coining a term for this dynamic — lethal beta — and tracing the full pipeline: from the AI systems deployed in Gaza, to the arms companies now filing for IPOs on the back of 241% revenue growth, to the Pentagon official who called Ukraine “an extraordinary laboratory” for military AI, to the ways the same logic is now normalizing autonomous warfare at a scale none of the original systems were designed for. As always, this isn’t just a story about technology. It’s a story about who makes decisions, who profits, and who pays. 
    Paid subscribers can read the full written analysis here.

    Further Reading:
    - “‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza”, +972 Magazine, April 3, 2024: https://www.972mag.com/lavender-ai-israeli-army-gaza/
    - “Dirty Secret of Israel’s Weapons Exports: They’re Tested on Palestinians”, Al Jazeera, November 17, 2023: https://www.aljazeera.com/features/2023/11/17/israels-weapons-industry-is-the-gaza-war-its-latest-test-lab
    - The Palestine Laboratory, Antony Loewenstein, Verso Books, 2023: https://www.amazon.com/Palestine-Laboratory-Exports-Technology-Occupation/dp/1839762217
    - “The Palestine Laboratory” (documentary), Al Jazeera English, January–February 2025: https://network.aljazeera.net/en/press-releases/%E2%80%98-palestine-laboratory%E2%80%99-exposes-israel%E2%80%99s-export-unique-systems-control-and
    - “Gaza: Israel’s AI Human Laboratory”, The Cairo Review of Global Affairs, June 12, 2025: https://www.thecairoreview.com/essays/gaza-israels-ai-human-laboratory/
    - “When AI Decides Who Lives and Dies”, Foreign Policy, May 2, 2024: https://foreignpolicy.com/2024/05/02/israel-military-artificial-intelligence-targeting-hamas-gaza-deaths-lavender/
    - “The Cruel Experiments of Israel’s Arms Industry”, Pulitzer Center: https://pulitzercenter.org/stories/cruel-experiments-israels-arms-industry
    - “The Genocide Will Be Automated: Israel, AI and the Future of War”, MERIP, October 2024: https://www.merip.org/2024/10/the-genocide-will-be-automated-israel-ai-and-the-future-of-war/
    - “War Rewrote the Rules: The World Studies Israel’s AI-Driven Battlefield Playbook”, Ynet News, February 2026: https://www.ynetnews.com/tech-and-digital/article/bjfoec900wl
    - “Artificial Intelligence on the Battlefield in 2025”, The Jerusalem Post: https://www.jpost.com/defense-and-tech/article-861611
    - “Ukraine Is an ‘Extraordinary Laboratory’ for Military AI”, DefenseScoop, August 1, 2023: https://defensescoop.com/2023/08/01/ukraine-is-extraordinary-laboratory-for-military-ai-senior-dod-official-says/
    - “The Horrifying, AI-Enhanced Future of War Is Here”, The New Republic, November 2025: https://newrepublic.com/article/202753/ukraine-drones-ai-enhanced-future-war
    - “Governing AI Under Fire in Ukraine”, The Cairo Review of Global Affairs, June 15, 2025: https://www.thecairoreview.com/essays/governing-ai-under-fire-in-ukraine/
    - “Battlefield Drones and the Accelerating Autonomous Arms Race in Ukraine”, Modern War Institute, West Point, January 10, 2025: https://mwi.westpoint.edu/battlefield-drones-and-the-accelerating-autonomous-arms-race/

    14 min
  2. 26/02

    A Big Week for Tech Accountability

    Last week I watched what may be the Big Tobacco moment for social media unfold in real time. The trial against Meta in Los Angeles is the first of an estimated 1,600 cases making a specific argument: that Section 230 doesn't protect a platform that deliberately engineered addictive behavior. Internal company documents — showing what these companies knew about harm and when — are entering the court record. And then Mark Zuckerberg was served with legal papers as he walked into court. We don't yet know which lawsuit. But the image says everything. On Friday I got into a public debate with Taylor Lorenz about whether the social media threat to young people is a moral panic or something genuinely new. We disagree. I think the internal documents coming out of these companies make the moral panic framing harder to sustain — when a company's own researchers document harm and management keeps optimizing for engagement, that's not cultural overreaction, that's a paper trail. Then this morning the Anthropic story broke. The Pentagon summoned CEO Dario Amodei and told him to drop his internal ethics restrictions on autonomous weapons and mass surveillance, or lose the contract. Amodei published an 80-page AI constitution last month and a 20,000-word warning essay this year. He named the trap he was worried about. Now he's in it. The full analysis is at The Rip Current. Paid subscribers get early access + full transcripts: https://theripcurrent.com

    9 min
  3. 20/02

    Zuckerberg's Testimony: What to Watch For

    For the first time in his life, Mark Zuckerberg will answer questions under oath — not to a Senate subcommittee where politicians perform for their clips, but to a jury of regular people whose only job is to decide whether he's telling the truth. This is a genuinely different situation, and here's how to watch it. The real danger for Zuckerberg isn't his testimony — it's the internal documents already in evidence that will be put in front of him. A 2018 Meta strategy document saying "if we want to win big with teens, we must bring them in as tweens." Emails from Meta's own tech chief reporting back to Zuckerberg about plastic surgery filters, with Zuckerberg's response being that he needed "more data" before acting on known harm. Internal communications in which Meta employees referred to themselves as "basically pushers." These don't sound like a company run by a thoughtful parent. The other thing to watch is Section 230 — the 1996 law that gives platforms blanket immunity for what users post. The plaintiffs' argument, which the judge has already allowed the jury to consider, is that this trial isn't about content. It's about design. Infinite scroll. Autoplay. The Like button. If design liability succeeds here, it blows a hole in the legal shield that has protected every major platform for decades. The question at the heart of this trial — who makes these decisions, who profits, and who ends up paying — is one I've been covering for years. Wednesday gives us a jury's answer. Originally published at The Rip Current. Paid subscribers get early access + full transcripts: https://theripcurrent.substack.com

    11 min
  4. 19/02

    The Social Media Trial Explained

    A 20-year-old woman started using YouTube at age six and Instagram at age nine. She's now suing both companies, and her case has just become the most important tech trial since the DOJ went after Microsoft in 1998. Here's what's actually at stake — and why it matters whether you're a parent or not. The trial isn't just about one person's mental health. It's a bellwether case for more than 1,500 similar lawsuits waiting in the pipeline, and the first time CEOs of major social media platforms — including Mark Zuckerberg, who testifies this week — have had to answer questions in front of a jury rather than a Senate subcommittee. The internal documents already in evidence are extraordinary: YouTube memos describing "viewer addiction" as a goal, Meta's Project Myst finding that traumatized kids were especially vulnerable to the platform and that parental controls made almost no difference, and a strategy document laying out a pipeline designed to bring kids in as tweens and keep them as teens. The central legal question is whether Section 230 — the 1996 law that has shielded every major platform from liability for nearly 30 years — protects design decisions like infinite scroll, autoplay, and the Like button. The judge has already ruled that the jury can consider design liability. If that argument wins, it changes the legal landscape for every platform that has ever made an engineering choice optimized for engagement. Nobody voted on infinite scroll. No regulator approved autoplay. A small group of engineers and executives made those decisions, and billions of people — including six-year-olds — inherited the results. A Los Angeles jury is now being asked to weigh in on that. Originally published at The Rip Current. Paid subscribers get early access + full transcripts: https://theripcurrent.substack.com

    11 min
