The Privacy Partnership Podcast with Robert Bateman


Robert Bateman provides the latest on data protection and privacy, with regular solo news updates and short-form interviews. Brought to you by Privacy Partnership: www.privacypartnership.com

  1. APR 21

    What actually counts as 'scientific research'? Here's the EDPB's six-point answer

    On 15 April 2026, the European Data Protection Board adopted Guidelines 1/2026 on the processing of personal data for scientific research purposes. The 66-page document is now out for public consultation. In this episode, Robert Bateman breaks down what the guidelines mean for pharma companies, AI developers, universities, and anyone relying on the GDPR's scientific research provisions. The GDPR gives scientific research significant special treatment: a presumption of compatibility for further processing, extended storage, broad consent, carve-outs from the right to erasure, and a narrower right to object. But to access those provisions, your processing first needs to qualify as "scientific research" at all.

    In this episode:
    - The EDPB's six-factor test for determining whether processing qualifies as scientific research
    - Why a for-profit AI start-up can qualify, but retail analytics can't
    - What "broad consent" actually means, and how it differs from "dynamic consent"
    - The high threshold for the "manifestly made public" exception after Schrems (October 2024)
    - When "covert research" is permitted under Article 14(5)(b)
    - How the guidelines sit alongside the Digital Omnibus and the European Biotech Act

    Useful references:
    - EDPB Guidelines 1/2026 (public consultation draft)
    - CJEU Case C-446/21, Schrems v Meta Platforms Ireland (4 October 2024)
    - Articles 5(1)(b), 9(2)(e), 14(5)(b), 17(3)(d), 21(6), and 89 GDPR
    - Consultation: open now on the EDPB website

    Host: Robert Bateman, Senior Partner at Privacy Partnership. Get in touch if your organisation needs support with GDPR compliance for research activities.

    6 min
  2. APR 7

    AI in recruitment: ICO highlights poor practices as UK overhauls automated decision-making rules

    Are your hiring managers quietly letting an algorithm bin hundreds of job applications while claiming a human is technically in charge? This week on the Privacy Partnership Podcast, Rob unpacks a massive structural shift in the UK's framework for automated decision-making (ADM). We dive into two major new releases from the ICO: the highly revealing Recruitment Rewired report and the newly updated draft guidance on ADM and profiling.

    With the Data (Use and Access) Act (DUAA) taking effect, the UK GDPR's approach to ADM has fundamentally changed, moving from a strict "prohibition with exceptions" to a more flexible "right of challenge with safeguards". Robert explains why this is arguably the most significant change under the DUAA, how it reduces friction for controllers by opening up legitimate interests as a lawful basis, and why the compliance burden hasn't disappeared but has shifted.

    We also look at where companies are still getting this horribly wrong. Although the ICO's Recruitment Rewired report covers a period before the DUAA took effect, the new draft guidance makes clear that the new Article 22C safeguards essentially codify the old rules. If you were failing then, you are failing now.

    In this episode, we cover:
    - The DUAA ADM overhaul: how Articles 22A-22D change the game for controllers, making it easier to deploy AI decision-making without relying on clumsy lawful bases.
    - The "meaningful human involvement" trap: why having a human rubber-stamp an AI's red-light rejection score is still a solely automated decision under the law.
    - Lawful basis headaches: why consent and contract are terrible fits for automated CV screening, and how legitimate interests (with the required LIA) is now the clear path forward.
    - Transparency and DPIA failures: a look at the worst practices the ICO found, including vague privacy notices, missing safeguards, and a solo legal team member signing off on a DPIA without consulting the DPO.
    Key quotes:
    - "The DUAA has undeniably made it easier to justify rolling out automated decision-making systems... But the structural requirements for fairness, transparency, and human intervention haven't vanished—they've just been recodified."
    - "If a human is simply applying the outcome of an automated system without actively evaluating the person's information, that is not meaningful human involvement."

    Resources and links:
    - The ICO's draft guidance on automated decision-making, including profiling: [Link to ICO Website]
    - The ICO's Recruitment Rewired report: [Link to ICO Website]
    - Learn more about the Data (Use and Access) Act (DUAA) changes to the UK GDPR.

    About the host: Robert Bateman is a privacy expert, analyst, and the host of the Privacy Partnership Podcast.

    Subscribe and review: If you enjoyed this episode, please subscribe to the Privacy Partnership Podcast on Apple Podcasts, Spotify, or your favourite podcast app. Leave us a rating and review to help other privacy professionals find the show!

    6 min
  3. MAR 5

    Are you a 'data broker'? Maybe, under the EDPB’s expanding definition

    Are you a data broker? You might not think so, but European regulators could soon be looking at your business model and concluding otherwise. In this episode of the Privacy Partnership Podcast, Robert Bateman breaks down a revealing market study commissioned by the Belgian Data Protection Authority through the EDPB's Support Pool of Experts. Designed to identify and map the data broker ecosystem, the report provides a fascinating look at how the regulatory definition of data brokerage is expanding far beyond the traditional back-room list sellers of the early internet.

    We explore how this behaviour-based European definition sharply contrasts with privity-focused frameworks like California's Delete Act. We also dive into the study's eight-part typology, which sweeps "privacy-preserving" data clean rooms, AI platforms trained on scraped data, and even B2B contact databases under the data broker umbrella. If your organisation ingests data, mixes it with other datasets, and monetises the insights, this is an episode you need to hear.

    In this episode, we cover:
    [0:00] Introduction: A look at the new EDPB-commissioned market study aiming to map the data broker ecosystem.
    [0:45] The definition and the California contrast: How the European focus on behaviour, profiling, and lack of "meaningful control" differs from the structural "direct relationship" test found in California's CCPA.
    [2:15] High-risk typologies: Why adtech's beloved data clean rooms and AI integration platforms are being classified as high-risk data brokers.
    [3:45] Medium-risk categories: The regulatory perspective on aggregated spatial data, mobility trends, and B2B contact lists, and where the risk of re-identification allegedly creeps back in.
    [4:10] Outro: Key takeaways for privacy professionals evaluating their own data supply chains and partnerships.

    6 min
