1 hr 46 min

Ryan Tibshirani: Statistics, Nonparametric Regression, Conformal Prediction
The Gradient: Perspectives on AI

Episode 121
I spoke with Professor Ryan Tibshirani about:
* Differences between the ML and statistics communities in scholarship, terminology, and other areas
* Trend filtering
* Why you can’t just use garbage prediction functions when doing conformal prediction
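To make the last point concrete, here is a minimal sketch (not code from the episode) of the standard split-conformal recipe on a toy linear data-generating process, using NumPy. The predictor names and data setup are purely hypothetical illustrations: the point is that the roughly 90% coverage guarantee holds whether the underlying model is a reasonable least-squares fit or a useless constant, but the garbage predictor pays for it with much wider, less informative intervals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = x + noise (hypothetical setup, only to illustrate the mechanics)
def sample(n):
    x = rng.uniform(-3, 3, size=n)
    return x, x + rng.normal(scale=0.5, size=n)

x_train, y_train = sample(1000)   # fit the predictor
x_cal, y_cal = sample(1000)       # calibrate the residuals
x_test, y_test = sample(1000)     # check coverage on fresh points

def split_conformal(predict, alpha=0.1):
    # Nonconformity scores: absolute residuals on the calibration set
    scores = np.abs(y_cal - predict(x_cal))
    # Finite-sample-corrected (1 - alpha) quantile of the scores
    k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
    q = np.inf if k > len(scores) else np.sort(scores)[k - 1]
    lo, hi = predict(x_test) - q, predict(x_test) + q
    coverage = np.mean((y_test >= lo) & (y_test <= hi))
    return coverage, float(np.mean(hi - lo))

# A sensible predictor (least-squares line) vs. a "garbage" one (always predicts 0)
coef = np.polyfit(x_train, y_train, deg=1)
for name, f in [("least-squares", lambda z: np.polyval(coef, z)),
                ("constant zero", lambda z: np.zeros_like(z))]:
    cov, width = split_conformal(f)
    print(f"{name}: coverage = {cov:.3f}, mean interval width = {width:.2f}")
```

Running this, both predictors land near the nominal 90% coverage, but the constant-zero predictor's intervals are several times wider, which is the practical sense in which a garbage prediction function is not "free" under conformal prediction.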
Ryan is a Professor in the Department of Statistics at UC Berkeley. He is also a Principal Investigator in the Delphi group. From 2011 to 2022, he was a faculty member in Statistics and Machine Learning at Carnegie Mellon University. From 2007 to 2011, he completed his Ph.D. in Statistics at Stanford University.
Reach me at editor@thegradient.pub for feedback, ideas, and guest suggestions.
The Gradient Podcast on: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter
Outline:
* (00:00) Intro
* (01:10) Ryan’s background and path into statistics
* (07:00) Cultivating taste as a researcher
* (11:00) Conversations within the statistics community
* (18:30) Use of terms, disagreements over stability and definitions
* (23:05) Nonparametric Regression
* (23:55) Background on trend filtering
* (33:48) Analysis and synthesis frameworks in problem formulation
* (39:45) Neural networks as a specific take on synthesis
* (40:55) Divided differences, falling factorials, and discrete splines
* (41:55) Motivations and background
* (48:07) Divided differences vs. derivatives, approximation and efficiency
* (51:40) Conformal prediction
* (52:40) Motivations
* (1:10:20) Probabilistic guarantees in conformal prediction, choice of predictors
* (1:14:25) Assumptions: i.i.d. and exchangeability — conformal prediction beyond exchangeability
* (1:25:00) Next directions
* (1:28:12) Epidemic forecasting — COVID-19 impact and trends survey
* (1:29:10) Survey methodology
* (1:38:20) Data defect correlation and its limitations for characterizing datasets
* (1:46:14) Outro
Links:
* Ryan’s homepage
* Works read/mentioned
* Nonparametric Regression
* Adaptive Piecewise Polynomial Estimation via Trend Filtering (2014) 
* Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems (2020)
* Distribution-free Inference
* Distribution-Free Predictive Inference for Regression (2017)
* Conformal Prediction Under Covariate Shift (2019)
* Conformal Prediction Beyond Exchangeability (2023)
* Delphi and COVID-19 research
* Flexible Modeling of Epidemics
* Real-Time Estimation of COVID-19 Infections
* The US COVID-19 Trends and Impact Survey
* Big data, big problems: Responding to “Are we there yet?”



Get full access to The Gradient at thegradientpub.substack.com/subscribe
