Artificial Neural Networks


FDS Colloquium: Elliot Paquette (McGill), “High-dimensional Optimization with Applications to Compute-Optimal Neural Scaling Laws”

Abstract: Given the massive scale of modern ML models, we now only get a single shot to train them effectively. This restricts our ability to test multiple […]


FDS Colloquium: Song Mei (Berkeley), “Revisiting neural network approximation theory in the age of generative AI”

Optional Zoom link: https://yale.zoom.us/j/97222935172

Abstract: Textbooks on deep learning theory primarily perceive neural networks as universal function approximators. While this classical viewpoint is fundamental, it […]


Workshop Honoring Andrew Barron: Forty Years at the Interplay of Information Theory, Probability and Statistical Learning (Day 3)

The Workshop Honoring Andrew Barron: “Forty Years at the Interplay of Information Theory, Probability and Statistical Learning” will take place from Friday, April 26 […]


Workshop Honoring Andrew Barron: Forty Years at the Interplay of Information Theory, Probability and Statistical Learning (Day 2)
Workshop Honoring Andrew Barron: Forty Years at the Interplay of Information Theory, Probability and Statistical Learning (Day 2)

The Workshop Honoring Andrew Barron: “Forty Years at the Interplay of Information Theory, Probability and Statistical Learning” will take place from Friday, April 26 […]


Workshop Honoring Andrew Barron: Forty Years at the Interplay of Information Theory, Probability and Statistical Learning
Workshop Honoring Andrew Barron: Forty Years at the Interplay of Information Theory, Probability and Statistical Learning

The Workshop Honoring Andrew Barron: “Forty Years at the Interplay of Information Theory, Probability and Statistical Learning” will take place from Friday, April 26 […]