Explores Recurrent Neural Networks for behavioral data, covering Deep Knowledge Tracing, LSTM, GRU networks, hyperparameter tuning, and time series prediction tasks.
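To make the GRU concrete, here is a minimal sketch of a single GRU step in pure Python, reduced to scalar input and state for readability; the weight tuple and the toy sequence are hypothetical, not from any particular dataset.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One GRU step for scalar input x and scalar state h.
    w holds the nine gate parameters: (wz, uz, bz, wr, ur, br, wh, uh, bh)."""
    wz, uz, bz, wr, ur, br, wh, uh, bh = w
    z = sigmoid(wz * x + uz * h + bz)                 # update gate
    r = sigmoid(wr * x + ur * h + br)                 # reset gate
    h_tilde = math.tanh(wh * x + uh * (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde                # blend old and candidate

# Run the cell over a short (hypothetical) behavioral sequence.
weights = (1.0, 0.5, 0.0, 1.0, 0.5, 0.0, 1.0, 0.5, 0.0)
h = 0.0
for x in [0.2, 0.7, 0.1]:
    h = gru_step(x, h, weights)
```

The same interface extends to vector-valued states by replacing the scalar products with matrix-vector products; frameworks such as PyTorch provide this as a built-in layer.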
Explores sequence-to-sequence (Seq2Seq) models with and without attention, covering the encoder-decoder architecture, context vectors, the decoding process, and common attention variants.
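The core of attention is computing a context vector as a score-weighted average of encoder states. A minimal dot-product attention sketch, with hypothetical toy vectors:

```python
import math

def attention_context(query, keys, values):
    """Dot-product attention: score each key against the query,
    softmax the scores, and return the weighted sum of the values."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(scores)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(values[0])
    context = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]
    return context, weights

# The key aligned with the query receives the largest attention weight.
ctx, w = attention_context([1.0, 0.0],
                           keys=[[1.0, 0.0], [0.0, 1.0]],
                           values=[[1.0, 0.0], [0.0, 1.0]])
```

Other attention variants (additive/Bahdanau, scaled dot-product) differ only in how the scores are computed; the softmax-and-average step is the same.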
Explores deep learning for NLP, covering word embeddings, context representations, and learning techniques, along with challenges such as vanishing gradients and ethical considerations.
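Word embeddings represent words as dense vectors whose geometry reflects meaning: semantically related words have higher cosine similarity. A small sketch with a hypothetical hand-made embedding table (real embeddings are learned and much higher-dimensional):

```python
import math

# Toy 3-d embedding table; the vectors below are illustrative, not learned.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# Related words sit closer together in embedding space.
sim_kq = cosine(embeddings["king"], embeddings["queen"])
sim_ka = cosine(embeddings["king"], embeddings["apple"])
```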
Introduces feed-forward networks, covering neural network structure, training, activation functions, and optimization, with applications in forecasting and finance.
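A feed-forward network is just a stack of affine maps followed by nonlinear activations. A minimal forward-pass sketch in pure Python, using a hypothetical hand-set 2-2-1 network (trained weights would come from an optimizer such as gradient descent):

```python
def relu(x):
    """Rectified linear unit activation."""
    return max(0.0, x)

def forward(x, layers):
    """Forward pass: each layer is (weight_matrix, bias_vector, activation)."""
    for W, b, act in layers:
        x = [act(sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x

# Toy 2-2-1 network with illustrative weights.
net = [
    ([[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0], relu),   # hidden layer, ReLU
    ([[1.0, 1.0]], [0.0], lambda z: z),               # linear output layer
]
y = forward([0.5, 0.2], net)
```

In a forecasting setting, `x` would hold lagged observations or engineered features and `y` the predicted value.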
Explores the theoretical properties and practical power of Recurrent Neural Networks, including their relationship to state machines and Turing completeness.
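The connection to state machines can be made concrete: with hard-threshold units and hand-wired weights, a tiny RNN exactly simulates the two-state parity automaton. A sketch (the weights below are chosen by hand to implement XOR of the state and the input, purely for illustration):

```python
def step(t):
    """Hard-threshold activation: fires when its input is positive."""
    return 1 if t > 0 else 0

def parity_rnn(bits):
    """A hand-wired threshold RNN simulating the parity automaton:
    the hidden state h tracks the XOR of all inputs seen so far."""
    h = 0
    for x in bits:
        a = step(x + h - 0.5)    # fires iff x OR h
        b = step(x + h - 1.5)    # fires iff x AND h
        h = step(a - b - 0.5)    # a AND NOT b  ==  x XOR h
    return h
```

Results on Turing completeness generalize this idea: with unbounded precision and suitable weights, recurrent networks can simulate arbitrary computation, though finite-precision networks in practice behave more like finite-state machines.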