The Role of Explicit Regularization in Overparameterized Neural Networks
Shiyu Liang, University of Illinois at Urbana-Champaign
2021-03-01 10:00:00 ~ 2021-03-01 11:30:00
Tencent Meeting, online (Meeting ID: 661 493 082, password: 953849)
Overparameterized neural networks have proved remarkably successful in many complex tasks such as image classification and deep reinforcement learning. In this talk, we consider the role of explicit regularization in training overparameterized neural networks. Specifically, for ReLU networks, we show that the landscape of commonly used regularized loss functions has the property that every local minimum exhibits good memorization and regularization performance.
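To make the object of study concrete, here is a minimal sketch (not taken from the talk) of an explicitly regularized loss for a one-hidden-layer ReLU network; the squared loss, L2 penalty, and regularization weight `lam` are illustrative assumptions, not details of the speaker's setting.

```python
import numpy as np

def relu_net(x, W1, b1, w2):
    """One-hidden-layer ReLU network: x -> w2 . relu(W1 @ x + b1)."""
    return np.maximum(W1 @ x + b1, 0.0) @ w2

def regularized_loss(params, X, y, lam=1e-2):
    """Empirical squared loss plus an explicit L2 (weight-decay) penalty."""
    W1, b1, w2 = params
    preds = np.array([relu_net(x, W1, b1, w2) for x in X])
    fit = np.mean((preds - y) ** 2)                       # memorization term
    penalty = lam * (np.sum(W1**2) + np.sum(b1**2) + np.sum(w2**2))
    return fit + penalty                                  # regularized objective

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))           # 8 samples, 3 features
y = rng.normal(size=8)
params = (rng.normal(size=(16, 3)),   # 16 hidden units: overparameterized
          np.zeros(16),               # relative to the 8 training samples
          rng.normal(size=16))
loss = regularized_loss(params, X, y)
```

The landscape result concerns objectives of this form: local minima of the regularized loss, rather than only global ones, fit the data well while keeping the penalty small.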
Shiyu Liang is a sixth-year Ph.D. student at the University of Illinois at Urbana-Champaign, advised by Professor R. Srikant. He is also pursuing an M.S. in Mathematics. Before joining UIUC, he graduated from Shanghai Jiao Tong University. His research interests include machine learning, optimization, and applied probability.