Explores the provable benefits of overparameterization for model compression, emphasizing how large deep neural networks can be compressed efficiently and why retraining the compressed model improves performance.
Covers Convolutional Neural Networks, including layers, training strategies, standard architectures, tasks like semantic segmentation, and deep learning tricks.
Presents an all-analog photoelectronic chip for high-speed vision tasks, addressing challenges in classical computation and proposing a hybrid optical-electrical framework.
Introduces a functional framework for deep neural networks with adaptive piecewise-linear spline activations, with a focus on biomedical image reconstruction and the challenges of training deep splines.
Introduces feed-forward networks, covering neural network structure, training, activation functions, and optimization, with applications in forecasting and finance.