Murat Erdogdu (Statistics, Stanford University)

Event Date: 

Friday, January 13, 2017 - 3:30pm to 5:00pm

Event Date Details: 

Refreshments served at 3:15pm.

Event Location: 

  • Sobel Seminar Room; South Hall 5607F

Event Series: 

  • Department Seminar Series

Abstract: Statistics and optimization have been closely linked from the outset of both fields. This connection has become more essential of late, driven by recent advances in computational resources, the availability of large amounts of data, and the consequent growing interest in statistical and machine learning algorithms. In this talk, I will discuss how tools from statistics, such as Stein's lemma and subsampling, can be used to design scalable, efficient, and reliable optimization algorithms. The focus will be on large-scale problems where iterative minimization of the empirical risk is computationally intractable, i.e., where the number of observations n is much larger than the parameter dimension p, n >> p >> 1. The proposed algorithms apply widely to supervised learning problems such as binary classification with smooth surrogate losses, generalized linear models in their canonical representation, and M-estimation. They rely on iterations constructed via Stein's lemma that achieve a quadratic convergence rate yet are cheaper than any batch optimization method by at least a factor of O(p). I will discuss the theoretical guarantees of the proposed algorithms, along with their convergence behavior as a function of the data dimensions. Finally, I will demonstrate their performance on well-known classification and regression problems through extensive numerical studies on large-scale real datasets, showing that they outperform other widely used and specialized algorithms.
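
To make the Stein's-lemma construction concrete, below is a minimal sketch of one such iteration for logistic regression, assuming zero-mean Gaussian-like covariates x ~ N(0, Sigma). Under that assumption, Stein's lemma gives E[f''(<x,b>) x x^T] = mu2(b) Sigma + mu4(b) (Sigma b)(Sigma b)^T, with mu2(b) = E[f''(<x,b>)] and mu4(b) = E[f''''(<x,b>)], so the Hessian can be estimated from one covariance estimate and two scalar moments instead of being recomputed from all n observations. The update rule, variable names, and step-size choices below are illustrative reconstructions from the abstract, not the speaker's exact algorithm.

    # Hypothetical sketch of a Stein's-lemma-based (Newton-Stein-style)
    # iteration for logistic regression; illustrative only.
    import numpy as np

    def sigmoid(w):
        # Clip to avoid overflow in exp; adequate for a sketch.
        return 1.0 / (1.0 + np.exp(-np.clip(w, -30.0, 30.0)))

    def newton_stein_logistic(X, y, n_sub=1000, n_iter=50, gamma=1.0, seed=0):
        """X: (n, p) covariates (assumed roughly zero-mean Gaussian),
        y: (n,) labels in {0, 1}. Returns the estimated coefficients."""
        rng = np.random.default_rng(seed)
        n, p = X.shape
        # Subsampling: estimate the covariance once from |S| << n rows.
        S = rng.choice(n, size=min(n_sub, n), replace=False)
        Sigma = X[S].T @ X[S] / len(S) + 1e-6 * np.eye(p)  # ridge for stability
        beta = np.zeros(p)
        for _ in range(n_iter):
            s = sigmoid(X @ beta)
            grad = X.T @ (s - y) / n                  # full-data gradient, O(np)
            # Stein's lemma for Gaussian design:
            #   E[f''(<x,b>) x x^T] = mu2*Sigma + mu4*(Sigma b)(Sigma b)^T,
            # with mu2 = E[f''], mu4 = E[f''''] estimated on the subsample.
            ss = s[S]
            f2 = ss * (1 - ss)                            # f''  of logistic loss
            f4 = ss * (1 - ss) * (1 - 6 * ss + 6 * ss**2) # f'''' of logistic loss
            mu2, mu4 = f2.mean(), f4.mean()
            Sb = Sigma @ beta
            H = mu2 * Sigma + mu4 * np.outer(Sb, Sb)  # Stein-type Hessian estimate
            beta = beta - gamma * np.linalg.solve(H, grad)
        return beta

The per-iteration cost is dominated by the matrix-vector products with X, i.e., O(np); forming the exact Hessian X^T D X in a full Newton step costs O(np^2), which is the O(p)-factor saving the abstract refers to. The rank-one structure of H would further admit an O(p^2) Sherman-Morrison solve in place of the dense solve used above.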