Mason Lab 211 with remote access option, 9 Hillhouse Avenue, New Haven, CT 06520
Towards the Statistically Principled Design of ML Algorithms
Abstract: What are the optimal algorithms for learning from data? Have we found them already, or are better ones out there to be discovered? Making these questions precise, and answering them, requires taking on the mathematically deep interplay between statistical and computational considerations. It also requires reconciling our theoretical toolbox with surprising new phenomena arising from practice, which seem to violate conventional rules of thumb regarding algorithm and model design. I will discuss progress along these lines: in terms of designing new algorithms for basic learning problems, controlling generalization in large statistical models, and understanding statistical questions arising from generative modeling.
Speaker Bio: I am currently at Stanford University as a Motwani Postdoctoral Fellow. Before that, I was a research fellow in the UC Berkeley Simons Institute program on the Computational Complexity of Statistical Inference. I received my PhD in Mathematics and Statistics from MIT, where I was co-advised by Ankur Moitra and Elchanan Mossel, and before that I received my undergraduate degree in Mathematics at Princeton University. My current research interests include computational learning theory and related topics: probability theory, high-dimensional statistics, optimization, and related aspects of statistical physics. In particular, I am very interested in learning and inference in graphical models.
Wednesday, February 01, 2023
3:30pm – Pre-talk meet-and-greet teatime – Dana House, 24 Hillhouse Ave.
4:00pm – 5:00pm – Talk – Mason Lab 211, 9 Hillhouse Ave., New Haven, CT