Deep Learning
-
FDS Colloquium: Jinchao Xu (KAUST), “Finite Element versus Finite Neuron Methods”
Talk summary: This talk presents a unified framework connecting Barron and Sobolev spaces to analyze the approximation properties of ReLU$^k$ neural networks. It establishes […]
-
S&DS Seminar: Jingfeng Wu (Berkeley), “Gradient Descent Dominates Ridge: A Statistical View on Implicit Regularization”
Talk summary: A key puzzle in deep learning is how simple gradient methods find generalizable solutions without explicit regularization. This talk discusses the implicit […]
-
S&DS Seminar: Zhuoran Yang (Yale), “Unveiling In-Context Learning: Provable Training Dynamics and Feature Learning in Transformers”
Abstract: In-context learning (ICL) is a cornerstone of large language model (LLM) functionality, yet its theoretical foundations remain elusive due to the complexity of transformer […]
-
S&DS Seminar: Blake Bordelon (Harvard), “Scaling Limits and Scaling Laws of Deep Learning”
Abstract: Scaling up the size and training horizon of deep learning models has enabled breakthroughs in computer vision and natural language processing. Empirical evidence suggests […]
