Speaker

Xiaolin Huang

Time

2024.04.03 16:00-17:30

Abstract

Generalization concerns performance on new data and is a core theoretical problem in machine learning. With the advent of deep models, generalization analysis has undergone fundamental changes, becoming one of the key issues in deep learning theory. This lecture will review the development of classical machine learning generalization theory, explain the significant challenges brought by deep learning, and introduce related progress from our research group: i) functional space expansion, ii) dynamic low-dimensional subspaces, iii) sharpness-aware minimization. The future development of theoretical machine learning will also be discussed.
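As background for the third topic, the core idea of sharpness-aware minimization (SAM, in its generic form) is a two-step update: first perturb the weights toward higher loss within a small radius, then descend using the gradient at that perturbed point. A minimal illustrative sketch on a toy quadratic loss (the loss function, step sizes, and iteration count are assumptions for demonstration, not the speaker's specific method):

```python
import numpy as np

def loss(w):
    # toy quadratic loss, chosen only for illustration
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # gradient of the toy loss above
    return w

def sam_step(w, lr=0.1, rho=0.05):
    # 1) ascent step: move within radius rho toward higher loss
    g = grad(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # 2) descent step: apply the gradient taken at the perturbed point
    g_adv = grad(w + eps)
    return w - lr * g_adv

w = np.array([2.0, -1.0])
for _ in range(100):
    w = sam_step(w)
```

Because the descent direction is evaluated at the worst-case nearby point, SAM biases training toward flat minima, which is one route to the generalization behavior discussed in the lecture.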

Bio

Xiaolin Huang is a Professor in the School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University. He obtained his B.S. from Xi'an Jiaotong University and his Ph.D. from Tsinghua University. He then worked as a postdoc at KU Leuven and as an Alexander von Humboldt Fellow at the University of Erlangen-Nuremberg. He joined Shanghai Jiao Tong University in 2016 and became a full professor in 2024.

His research interests lie in generalization analysis for deep learning, especially indefinite learning and low-dimensional structures in training dynamics. His work has led to more than 10 papers in top machine learning journals, namely JMLR and IEEE TPAMI. He also has a review on piecewise linear neural networks in Nature Reviews. He is currently serving as an Action Editor for Machine Learning, an Area Chair for ICCV, and a Senior PC member for AAAI. Besides theoretical research, he has also been working with Huawei and Medtronic on industrial applications.