L05: Recurrent Neural Networks and Transformers: Sequence-to-Sequence Learning, RNNs and LSTMs

Lecture Goals

  • Know when prediction tasks can have sequential dependencies
  • Understand the RNN architecture and how it is unfolded through time (a minimal sketch follows this list)
  • Know how LSTMs work
  • Applications of sequence-to-sequence (seq2seq) models
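To make the "unfolding" goal concrete, here is a minimal sketch of a vanilla RNN forward pass in NumPy. The function and parameter names (rnn_forward, W_xh, W_hh, b_h) are illustrative assumptions, not part of the lecture; the point is that the same weights are applied at every time step as the recurrence is unrolled over the sequence.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h, h0):
    """Vanilla RNN forward pass, unrolled over time.

    xs: input sequence of shape (T, input_dim)
    returns: hidden states of shape (T, hidden_dim)
    """
    h = h0
    hs = []
    for x_t in xs:                                    # "unfolding": one step per time index
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)      # same weights reused at every step
        hs.append(h)
    return np.stack(hs)

# Example usage with random (illustrative) parameters
rng = np.random.default_rng(0)
input_dim, hidden_dim, T = 4, 8, 5
xs = rng.normal(size=(T, input_dim))
W_xh = rng.normal(size=(hidden_dim, input_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

hs = rnn_forward(xs, W_xh, W_hh, b_h, h0=np.zeros(hidden_dim))
print(hs.shape)  # (5, 8): one hidden state per time step
```

An LSTM follows the same unrolled structure but replaces the single tanh update with gated cell-state updates, which the lecture covers in detail.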