Newsroom
Large-Scale Optimization
-
S&DS Seminar: Adam Block (Columbia), “Scaling Inference-Time Compute: From Self-Improvement to Pessimism”
Abstract: Language models increasingly rely on scaling inference-time computation to achieve state-of-the-art performance on a growing number of reasoning tasks. A popular paradigm for […]
-
FDS Colloquium: George Lan (Georgia Tech), “Algorithmic Foundations of Risk-averse Optimization for Trustworthy AI”
Talk summary: Over the past two decades, stochastic optimization has made remarkable strides, driving its widespread adoption in machine learning (ML) and artificial intelligence (AI). […]
-
FDS Colloquium: Elliot Paquette (McGill), “High-dimensional Optimization with Applications to Compute-Optimal Neural Scaling Laws”
Abstract: Given the massive scale of modern ML models, we now only get a single shot to train them effectively. This restricts our ability to test multiple […]
-
FDS Colloquium: Emma Zang (Yale), “Harnessing AI and Digital Data: Unlocking New Frontiers in Family Research”
Abstract: AI and digital data have been increasingly used to explore a wide range of social science topics, yet their potential in family research […]
-
S&DS Seminar: Florentina Bunea (Cornell), “Learning Large Softmax Mixtures with Warm Start EM”
Mixed multinomial logits are discrete mixtures introduced several decades ago to model the probability of choosing an attribute x_j ∈ R^L from p possible candidates, […]
-
FDS Colloquium: Bento Natura (Columbia), “Faster Exact Linear Programming”
Optional Zoom link: https://yale.zoom.us/j/99342713421 Abstract: We present a novel algorithm to solve various subclasses of linear programs, with a particular focus on strongly polynomial […]
