16 min

Alternate Histories and GPT-3 Riskgaming


"[The text] GPT-3 was trained on is so large that the model contains a certain fraction of the actual complexity of the world. But how much is actually inside these models, implicitly embedded within these neural networks?

I decided to test this and see if I could examine the GPT-3 model of the world through the use of counterfactuals. Specifically, I wanted to see if GPT-3 could productively unspool histories of the world if things were slightly different, such as if the outcome of a war were different or a historical figure hadn’t been born. I wanted to see how well it could write alternate histories." - Samuel Arbesman

From the Cabinet of Wonders newsletter by Samuel Arbesman

Great tweet thread summarizing his post

"Securities" podcast is produced and edited by Chris Gates

