Ilya’s 30u30 notes
Apparently this is a list of papers that Ilya Sutskever gave Carmack. Who knows if it’s true. But a student has to start somewhere.
- Attention is All You Need [paper] [notes]
- The First Law of Complexodynamics [link] [notes]
- Effectiveness of RNNs
- Understanding LSTM Networks
- Deep Residual Learning
- GPipe
- ImageNet with Deep ConvNets
- Neural Machine Translation
- Neural Message Passing for Quantum Chemistry
- Pointer Networks
- Seq2Seq for Sets
- Scaling Laws for Neural Language Models
- Identity Mappings in Deep Residual Networks
- The Annotated Transformer
- Recurrent Neural Network Regularization
- Keeping Neural Networks Simple by Minimizing the Description Length of the Weights
- Multi-Scale Context Aggregation by Dilated Convolutions
- A Simple NN Module for Relational Reasoning
- Variational Lossy Autoencoder
- Relational RNNs
- Quantifying the Rise and Fall of Complexity in Closed Systems: The Coffee Automaton
- Neural Turing Machines
- Deep Speech 2: End-to-End Speech Recognition in English and Mandarin
- A Tutorial Introduction to the Minimum Description Length Principle
- Machine Super Intelligence Dissertation
- Page 434 onwards: Kolmogorov Complexity
- CS231n Convolutional Neural Networks for Visual Recognition