Jason’s Notes
Deep Learning
- Logistic regression
- Backprop for logistic regression
- Backprop for a two-layer neural net
- Deriving backprop
- A pure numpy neural net in less than 200 lines (code)
- Regularization, activation functions, initialization, optimization, batch norm
- Convolutional neural nets
- Recurrent neural nets
NLP
Machine Learning
Computer Science
To do
- XLNet
- BART
- SimCLR
Most of my ML/DL notes are from Andrew Ng’s Deep Learning and Machine Learning Coursera courses. I also really liked Hands-On Machine Learning with Scikit-Learn and TensorFlow by Aurélien Géron.
My NLP notes are based on content from Andrew Ng, Graham Neubig, papers, and video explanations on YouTube.
People to follow
- Anything written by Sam Greydanus
- Nanyun Peng’s work on creative language generation
- Work from Ryan Cotterell’s lab
- Tim Althoff’s work on mental health
- Jacob Andreas
- Robin Jia