PyTorch Developer Podcast

DataLoader with multiple workers leaks memory

Today I'm going to talk about a famous issue in PyTorch: DataLoader with num_workers > 0 causes a memory leak (https://github.com/pytorch/pytorch/issues/13246). This bug is a good opportunity to talk about Dataset/DataLoader design in PyTorch, fork and copy-on-write memory in Linux, and Python reference counting; you have to know about all of these things to understand why this bug occurs, but once you do, it also explains why the workarounds help.
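The core of the interaction is worth seeing concretely. After fork(), parent and child share memory pages copy-on-write, so in principle a read-only dataset costs nothing extra per worker. But CPython stores the reference count inside each object, so even a "read-only" access to a Python object writes to its page and forces the kernel to copy it. The sketch below (illustrative only, not the DataLoader code itself) shows that merely taking a reference is a write, via `sys.getrefcount`:

```python
import sys

# Stand-in for a Dataset that holds its samples as plain Python objects.
data = ["sample-%d" % i for i in range(5)]

before = sys.getrefcount(data[0])
ref = data[0]   # a "read-only" access still creates a new reference...
after = sys.getrefcount(data[0])

# ...and that refcount bump is a memory WRITE to the object's page.
# In a forked DataLoader worker, this write dirties the shared page and
# triggers copy-on-write, so memory usage grows as workers touch the data.
print(after - before)  # 1
```

This is also why the common workarounds help: storing samples in a single numpy array or tensor (rather than a list of Python objects) means worker reads don't touch per-object refcount fields, so the shared pages stay clean.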

Further reading.

  • A nice summary of the full issue https://github.com/pytorch/pytorch/issues/13246#issuecomment-905703662
  • DataLoader architecture RFC https://github.com/pytorch/pytorch/issues/49440
  • Cinder Python https://github.com/facebookincubator/cinder