Jason’s Notes

Deep Learning

  1. Logistic regression
  2. Backprop for logistic regression
  3. Backprop for a two-layer neural net
  4. Deriving backprop
  5. A pure numpy neural net in less than 200 lines (code)
  6. Regularization, activation functions, initialization, optimization, batch norm
  7. Convolutional neural nets
  8. Recurrent neural nets
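As a taste of items 1–2, here is a minimal logistic-regression forward pass and gradient step in pure numpy. The data, shapes, and learning rate are toy values I picked for illustration, not anything from the notes themselves:

```python
import numpy as np

# Toy data: 4 examples, 3 features (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = np.array([0.0, 1.0, 1.0, 0.0])

w = np.zeros(3)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: predicted probabilities.
p = sigmoid(X @ w + b)

# Backprop for logistic regression: the gradient of the mean
# cross-entropy loss works out to (p - y) times the inputs.
dw = X.T @ (p - y) / len(y)
db = np.mean(p - y)

# One gradient-descent step.
lr = 0.1
w -= lr * dw
b -= lr * db
```

With zero weights the first forward pass predicts 0.5 everywhere; repeating the last four lines in a loop is all "training" is here.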

NLP

  1. Word embeddings
  2. Attention
  3. Transformers
  4. BERT
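The core of items 2–3 is scaled dot-product attention. A single-head sketch in numpy, with toy shapes I chose for illustration (randomly initialized Q, K, V rather than learned projections):

```python
import numpy as np

# Toy shapes: 5 positions, key dimension 8 (illustrative only).
rng = np.random.default_rng(0)
seq_len, d_k = 5, 8
Q = rng.normal(size=(seq_len, d_k))
K = rng.normal(size=(seq_len, d_k))
V = rng.normal(size=(seq_len, d_k))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Each row of `weights` says how much that query position
# attends to every key position; rows sum to 1.
scores = Q @ K.T / np.sqrt(d_k)      # (seq_len, seq_len)
weights = softmax(scores, axis=-1)
out = weights @ V                    # (seq_len, d_k)
```

In a real Transformer, Q, K, and V come from learned linear maps of the input, and several such heads run in parallel.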

Machine Learning

  1. Decision trees and random forests
  2. Support vector machines and kernels
  3. k-means
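Item 3 fits in a few lines: k-means just alternates between assigning points to their nearest centroid and moving each centroid to the mean of its points. A rough numpy sketch on made-up, well-separated data (cluster count, iterations, and data are all my assumptions):

```python
import numpy as np

# Two obvious clusters of 20 points each (toy data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)),
               rng.normal(5, 0.5, (20, 2))])

k = 2
# Initialize centroids at k randomly chosen data points.
centroids = X[rng.choice(len(X), k, replace=False)]
for _ in range(10):
    # Assignment step: label each point with its nearest centroid.
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=-1)
    labels = dists.argmin(axis=1)
    # Update step: move each centroid to the mean of its points.
    centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
```

A production version would also handle empty clusters and run several random restarts, since k-means only finds a local optimum.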

Computer Science

To do

  1. XLNet
  2. BART
  3. SimCLR

Most of my ML/DL notes are from Andrew Ng’s Deep Learning and Machine Learning Coursera courses. I also really liked Hands-On Machine Learning with Scikit-Learn and TensorFlow by Aurélien Géron.

My NLP notes are based on content from Andrew Ng, Graham Neubig, papers, and YouTube explanations.

People to follow

  1. Anything written by Sam Greydanus
  2. Nanyun Peng’s work on creative language generation
  3. Work from Ryan Cotterell’s lab
  4. Tim Althoff’s work on mental health
  5. Jacob Andreas
  6. Robin Jia