Covers the foundational concepts of deep learning and the Transformer architecture, focusing on neural networks, attention mechanisms, and their applications in sequence modeling tasks.
Examines chemical reaction prediction using generative models and molecular transformers, emphasizing molecular language processing and the correct handling of stereochemistry.
Explores Seq2Seq models with and without attention, covering the encoder-decoder architecture, context vectors, the decoding process, and the main types of attention mechanisms.
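The core idea behind attention in Seq2Seq models — scoring each encoder state against the current decoder state and forming a context vector as a weighted sum — can be sketched in a few lines. This is a minimal illustration using simple dot-product scoring with NumPy; the function name and dimensions are chosen for the example, not taken from any particular library.

```python
import numpy as np

def dot_product_attention(decoder_state, encoder_states):
    """Compute a context vector as an attention-weighted sum of encoder states.

    decoder_state:  (d,)  current decoder hidden state (the "query")
    encoder_states: (T, d) encoder hidden states, one per source position
    """
    # Alignment scores: how well each source position matches the query.
    scores = encoder_states @ decoder_state            # shape (T,)
    # Softmax (shifted for numerical stability) -> attention weights.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted average of the encoder states.
    context = weights @ encoder_states                 # shape (d,)
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))   # 5 source positions, hidden size 8
dec = rng.normal(size=8)
context, weights = dot_product_attention(dec, enc)
```

At each decoding step the decoder's hidden state re-queries the encoder states, so the context vector changes per output token — the key difference from the fixed single context vector of attention-free Seq2Seq models.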