
FDS Colloquium: Nathan Srebro (TTIC) “Interpolation Learning and Overfitting with Linear Predictors and Short Programs”

Wednesday, March 29, 2023    
4:00PM – 5:00PM


Location: Mason 211 or remote access: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=7e9e0891-7848-44ad-91e7-af93011fd580

Speaker: Nathan Srebro
Professor, Toyota Technological Institute at Chicago

Abstract: Classical theory, conventional wisdom, and all textbooks tell us to avoid reaching zero training error and overfitting the noise, and instead to balance model fit and complexity. Yet recent empirical and theoretical results suggest that in many cases overfitting is benign, and even interpolating the training data can lead to good generalization. Can we characterize and understand when overfitting is indeed benign, and when it is catastrophic, as classic theory suggests? And can existing theoretical approaches be used to study and explain benign overfitting and the “double descent” curve? I will discuss interpolation learning in linear (and kernel) methods, as well as using the universal “minimum description length” or “shortest program” learning rule.
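
To make the linear interpolation setting above concrete, here is a small numerical sketch (illustrative only, not code from the talk; the dimensions and noise levels are arbitrary assumptions). It fits the minimum ℓ2-norm linear interpolator to noisy labels in an overparameterized design with a few strong features and many weak ones, a toy regime in which the training error is exactly zero yet the test error stays close to the noise level.

```python
# Purely illustrative sketch (not from the talk): minimum l2-norm interpolation
# in an overparameterized linear regression with a few strong features and many
# weak, low-variance features. All sizes and noise levels are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 50                # training samples
k = 10                # strong features carrying the signal
m = 5000              # many weak features
sigma_noise = 0.1     # label noise
sigma_weak = 0.02     # std of the weak features

# The true signal lives only on the strong features.
w_star = np.concatenate([np.ones(k), np.zeros(m)])

def sample(num):
    strong = rng.normal(size=(num, k))
    weak = sigma_weak * rng.normal(size=(num, m))
    X = np.hstack([strong, weak])
    y = X @ w_star + sigma_noise * rng.normal(size=num)
    return X, y

X_train, y_train = sample(n)
X_test, y_test = sample(2000)

# Minimum-norm interpolating solution: w = X^+ y (Moore-Penrose pseudoinverse).
w_hat = np.linalg.pinv(X_train) @ y_train

train_mse = np.mean((X_train @ w_hat - y_train) ** 2)
test_mse = np.mean((X_test @ w_hat - y_test) ** 2)
null_mse = np.mean(y_test ** 2)   # test error of predicting zero everywhere

print(f"train MSE: {train_mse:.2e}  (interpolation: noisy labels fit exactly)")
print(f"test  MSE: {test_mse:.3f}   vs. noise floor {sigma_noise**2:.3f}")
print(f"null  MSE: {null_mse:.3f}   (predicting zero)")
```

The pseudoinverse solution is the minimum-norm interpolator among all zero-training-error solutions, which is also what gradient descent on the least-squares objective converges to from zero initialization; the many weak features absorb the label noise with little effect on test predictions, giving one simple example of the "benign" overfitting the talk examines.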

Bio: Nati (Nathan) Srebro is a professor at the Toyota Technological Institute at Chicago, with cross-appointments at the University of Chicago’s Department of Computer Science and Committee on Computational and Applied Mathematics. He obtained his PhD from the Massachusetts Institute of Technology in 2004, and has previously been a postdoctoral fellow at the University of Toronto, a visiting scientist at IBM, and an associate professor at the Technion.

Dr. Srebro’s research encompasses methodological, statistical and computational aspects of machine learning, as well as related problems in optimization. Some of Srebro’s significant contributions include work on learning “wider” Markov networks, introducing the use of the nuclear norm for machine learning and matrix reconstruction, work on fast optimization techniques for machine learning, and work on the relationship between learning and optimization. His current interests include understanding deep learning through a detailed analysis of optimization, distributed and federated learning, algorithmic fairness and practical adaptive data analysis.

