Events
Colloquium
Scaling Neural Networks: Laws and Limits
Speaker: Cengiz Pehlevan, Assistant Professor of Applied Mathematics, Harvard University
Date: Wednesday, March 4, 2026, 11:30AM - 1:00PM
Lunch: 11:30am in Room 1307
Talk: 12:00-1:00pm in Room 1327
Location: Yale Institute for Foundations of Data Science, 219 Prospect Street, 13th Floor, New Haven, CT 06511
Webcast: https://yale.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=cd4449de-8508-4184-aa35-b3ca01484881
Abstract: Scaling up neural network models has enabled unprecedented capabilities through learning, but we still lack the first-principles understanding needed to ensure their safety, reliability, and efficiency. I will show how tools from statistical mechanics and random matrix theory allow us to analyze neural networks in appropriate infinite-size scaling limits, thereby mapping the learning regimes that govern observed scaling behavior. These results account for the main features of empirical neural scaling laws; enable transfer of near-optimal hyperparameters across model sizes, yielding significant computational benefits; and provide a framework for understanding emergent behaviors such as in-context learning.
Speaker Bio: Cengiz (pronounced “Jen·ghiz”) comes to Harvard SEAS from the Flatiron Institute’s Center for Computational Biology (CCB), where he was a research scientist in the neuroscience group. Before CCB, Cengiz was a postdoctoral associate at Janelia Research Campus, and before that a Swartz Fellow at Harvard.
Cengiz received a doctorate in physics from Brown University and undergraduate degrees in physics and electrical engineering from Bogazici University. He is a native of Tosya, Turkey.
