Newsroom
Optimization
- FDS Colloquium: Lorenzo Orecchia (Chicago), “Variational Characterizations of First-Order Algorithms via Self-Duality”
Talk summary: First-order methods for convex optimization play an important role in the efficient deployment of machine learning algorithms. While a large number of […]
- FDS Colloquium: George Lan (Georgia Tech), “Algorithmic Foundations of Risk-averse Optimization for Trustworthy AI”
Talk summary: Over the past two decades, stochastic optimization has made remarkable strides, driving its widespread adoption in machine learning (ML) and artificial intelligence (AI). […]
- FDS Colloquium: Pravesh Kothari (Princeton), “The Surprising Reach of Spectral Algorithms for Smoothed k-SAT”
Abstract: Semirandom input models are hybrids of the classical worst-case and average-case models in algorithm design. They were introduced in the 1990s to inspire […]
- FDS Colloquium: Elliot Paquette (McGill), “High-dimensional Optimization with Applications to Compute-Optimal Neural Scaling Laws”
Abstract: Given the massive scale of modern ML models, we now get only a single shot to train them effectively. This restricts our ability to test multiple […]
- FDS x Applied Physics Colloquium: Grant Rotskoff (Stanford), “Efficient Variational Inference with Generative Models”
Abstract: Neural networks continue to surprise us with their remarkable capabilities for high-dimensional function approximation. Applications of machine learning now pervade essentially every scientific […]
- FDS Colloquium: Brice Huang (MIT), “Algorithmic Thresholds in Random Optimization Problems”
Abstract: Optimizing high-dimensional functions generated from random data is a central problem in modern statistics and machine learning. As these objectives are highly non-convex, […]
