Ilya’s 30u30 notes

Apparently this is the list of papers that Ilya Sutskever gave John Carmack. Who knows if it’s true, but a student has to start somewhere.

  1. Attention is All You Need [paper] [notes] (see the attention sketch after this list)
  2. The First Law of Complexodynamics [link] [notes]
  3. The Unreasonable Effectiveness of RNNs
  4. Understanding LSTM Networks
  5. Deep Residual Learning
  6. GPipe
  7. ImageNet Classification with Deep CNNs
  8. Neural Machine Translation by Jointly Learning to Align and Translate
  9. Neural Message Passing for Quantum Chemistry
  10. Pointer Networks
  11. Order Matters: Seq2Seq for Sets
  12. Scaling Laws for Neural Language Models
  13. Identity Mappings in Deep Residual Networks
  14. The Annotated Transformer
  15. Recurrent Neural Network Regularization
  16. Keeping Neural Networks Simple by Minimizing the Description Length of the Weights
  17. Multi-Scale Context Aggregation by Dilated Convolutions
  18. A Simple NN Module for Relational Reasoning
  19. Variational Lossy Autoencoder
  20. Relational RNNs
  21. Quantifying the Rise and Fall of Complexity in Closed Systems: The Coffee Automaton
  22. Neural Turing Machines
  23. Deep Speech 2: End-to-End Speech Recognition in English and Mandarin
  24. A Tutorial Introduction to the Minimum Description Length Principle
  25. Machine Super Intelligence (Shane Legg’s PhD dissertation)
  26. Page 434 onwards: Kolmogorov Complexity
  27. CS231n Convolutional Neural Networks for Visual Recognition
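Since (1) is where most people start, here’s a minimal sketch of scaled dot-product attention, the operation that paper is named for. This is my own toy illustration (single head, no masking, no learned projections, and the function name is just mine), not the paper’s full multi-head layer:

```python
# Toy scaled dot-product attention from "Attention is All You Need" (item 1).
# Simplifications (mine): one head, no mask, no learned Q/K/V projections.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> output: (n_q, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                    # query-key similarities, scaled
    scores -= scores.max(axis=-1, keepdims=True)     # stabilize softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax over keys
    return weights @ V                               # attention-weighted sum of values

# Usage: 3 queries attending over 4 key/value pairs of dimension 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```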