Wednesday, October 30, 2019 - 3:30pm to 4:30pm
- Broida 1640
Title: High-dimensional asymptotics of empirical risk minimization: A Gaussian processes approach
Classical estimation theory studies problems in which the number of unknown parameters is small compared to the number of observations. In contrast, modern inference problems are typically high-dimensional, with examples abundant in signal processing and machine learning applications. Unfortunately, classical tools and theories do not apply in this modern regime.
In this talk, I describe a framework to sharply characterize the statistical performance of popular convex empirical-risk-minimization methods in high dimensions. I will demonstrate that, albeit challenging to obtain, sharp results are advantageous over loose order-wise bounds: not only do they allow accurate comparisons between different choices of the optimization parameters, but they also form the basis for establishing optimal such choices, as well as fundamental performance limitations.
To demonstrate the generality of the framework, I will discuss its applications to sparse linear regression, one-bit compressed sensing, and binary classification. At the core of the framework lies a new theorem on Gaussian process comparisons, which I will also highlight.
Christos Thrampoulidis is an Assistant Professor in the ECE Department at UC Santa Barbara. His research interests include statistical signal processing, optimization, and machine learning. Before joining UCSB, Dr. Thrampoulidis was a Postdoctoral Researcher at MIT. He received his M.Sc. and Ph.D. degrees in Electrical Engineering from Caltech in 2012 and 2016, respectively, and his Diploma in Electrical and Computer Engineering from the University of Patras, Greece, in 2011. He is a recipient of the 2014 Qualcomm Innovation Fellowship and of the 2011 Andreas Mentzelopoulos Scholarship.