Title: Eigencurve: Optimal Learning Rate Schedule for SGD on Quadratic Objectives with Skewed Hessian Spectrums
Speaker: Associate Professor Ye Haishan (叶海山), Xi'an Jiaotong University
Host: Zhao Zhihua (赵志华)
Time: Tuesday, June 13, 2023, 3:00–5:20 p.m.
Venue: Room 113, Conference Center, Cybersecurity Building, South Campus
Speaker bio: Ye Haishan is an associate professor and doctoral supervisor at the School of Management, Xi'an Jiaotong University, and was selected for the Youth Project of the Shaanxi Province High-Level Talent Recruitment Program. His research focuses on the theory, algorithms, and applications of mathematical optimization. He has published dozens of papers in international journals and conferences such as Mathematical Programming, Journal of Machine Learning Research, and IEEE Transactions on Neural Networks and Learning Systems.
Abstract: Learning rate schedulers have been widely adopted in training deep neural networks. Despite their practical importance, there is a discrepancy between their use in practice and their theoretical analysis. For instance, it is not known which SGD schedules achieve the best convergence, even for simple problems such as optimizing quadratic objectives. In this work, we propose Eigencurve, the first family of learning rate schedules that achieves minimax optimal convergence rates (up to a constant) for SGD on quadratic objectives when the eigenvalue distribution of the underlying Hessian matrix is skewed. This condition is quite common in practice. Experimental results show that Eigencurve can significantly outperform step decay in image classification tasks on CIFAR-10, especially when the number of epochs is small. Moreover, the theory inspires two simple learning rate schedulers for practical applications that approximate Eigencurve. For some problems, the optimal shape of the proposed schedulers resembles that of cosine decay, which sheds light on the success of cosine decay in such situations. For other situations, the proposed schedulers are superior to cosine decay.
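For context, the two baseline schedules the abstract compares against, step decay and cosine decay, can be written in a few lines. This is a minimal sketch of those standard schedules only; Eigencurve itself depends on the Hessian's eigenvalue distribution and is not reproduced here. The function names and parameters (`eta0`, `n_stages`, `factor`) are illustrative choices, not from the paper.

```python
import math

def cosine_decay(eta0, t, T):
    """Standard cosine decay: anneal from eta0 at step 0 down to 0 at step T."""
    return 0.5 * eta0 * (1.0 + math.cos(math.pi * t / T))

def step_decay(eta0, t, T, n_stages=3, factor=0.1):
    """Step decay: split the T steps into n_stages equal stages and
    multiply the learning rate by `factor` at each stage boundary."""
    stage = min(t * n_stages // T, n_stages - 1)
    return eta0 * factor ** stage
```

For example, with `eta0 = 0.1` and `T = 100`, `cosine_decay` starts at 0.1 and smoothly approaches 0, while `step_decay` holds 0.1, then 0.01, then 0.001 over three equal stages.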