Explores high-performance optimal power flow (OPF) solvers, addressing challenges in power system optimization and showcasing significant speed-ups and memory-efficient approaches.
Explores the evolution of hardware/software co-design, emphasizing the importance of specialization and the challenges of optimizing performance and energy efficiency.
Covers subquadratic attention mechanisms and state space models, focusing on their theoretical foundations and practical implementations in machine learning.
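The state space models mentioned above are built on a simple recurrence. As a minimal sketch (not the specific parameterization covered in the session), a discrete linear state space model updates a fixed-size hidden state at each step, so processing a sequence costs O(n) in its length:

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discrete linear state space model over a 1-D input sequence u.

    x_{t+1} = A x_t + B u_t,   y_t = C x_t

    Each step touches only the fixed-size state vector x, so the whole
    scan is O(n) in sequence length n, versus O(n^2) for full attention.
    """
    x = np.zeros(A.shape[0])
    ys = []
    for u_t in u:
        ys.append(C @ x)       # read out the current state
        x = A @ x + B * u_t    # fold the new input into the state
    return np.array(ys)
```

With `A = [[0.5]]`, `B = [1.0]`, `C = [1.0]`, an impulse input decays geometrically through the state, illustrating how the recurrence summarizes the past in constant memory.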
Covers the transformer architecture and subquadratic attention mechanisms, focusing on efficient approximations and their applications in machine learning.
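One family of efficient approximations replaces the softmax with a feature map so the matrix products can be reassociated. A minimal sketch of this kernelized "linear attention" idea, using an assumed feature map `phi(x) = relu(x) + 1` for illustration:

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes an n x n score matrix,
    # so it costs O(n^2) time and memory in sequence length n.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1.0):
    # Approximate softmax(Q K^T) by phi(Q) phi(K)^T, then reassociate:
    # (phi(Q) phi(K)^T) V  ->  phi(Q) (phi(K)^T V).
    # The inner summary phi(K)^T V is d x d_v, independent of n,
    # so the whole computation is linear in sequence length.
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                 # fixed-size summary of keys and values
    Z = Qp @ Kp.sum(axis=0)       # per-query normalizer
    return (Qp @ KV) / Z[:, None]
```

The feature map here is a placeholder; published variants differ in their choice of `phi` and in how closely they match softmax attention, but the reassociation trick shown is what makes the cost subquadratic.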
Introduces Dynamic Programming, focusing on saving computation by remembering previous calculations and applying it to solve optimization problems efficiently.
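The core idea of saving computation by remembering previous calculations can be sketched with the classic Fibonacci example: the naive recursion solves the same subproblems exponentially many times, while a cache ensures each is solved once.

```python
from functools import lru_cache

# Memoization: cache each subproblem's answer so it is computed once.
# Without the cache, fib(n) makes O(phi^n) recursive calls; with it, O(n).
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))  # returns instantly; the uncached version would take hours
```

This top-down memoized form is equivalent to filling a table bottom-up, which is the other standard way dynamic programming trades memory for time.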