Newsroom
Information Theory
-
FDS Colloquium: Jun’ichi Takeuchi (Kyushu), “Fisher information and Neural Tangent Kernels”
Abstract: We discuss the relation between neural tangent kernels (NTK) and the Fisher information matrices of neural networks. For the Fisher information matrices of two layer […]
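
For readers new to the terms, one concrete link that is easy to check numerically (a minimal sketch under standard squared-loss assumptions, not the speaker's result): both the empirical NTK Gram matrix J J^T and the Fisher/Gauss-Newton matrix J^T J are built from the same parameter Jacobian J, so they share their nonzero eigenvalues.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, m = 8, 3, 16                              # samples, input dim, hidden width
    X = rng.standard_normal((n, d))
    W = rng.standard_normal((m, d)) / np.sqrt(d)    # first-layer weights
    a = rng.standard_normal(m) / np.sqrt(m)         # second-layer weights

    def jacobian_row(x):
        # Gradient of f(x) = a . relu(W x) with respect to all parameters (W, a).
        pre = W @ x
        act = np.maximum(pre, 0.0)
        d_act = (pre > 0).astype(float)
        dW = np.outer(a * d_act, x).ravel()
        return np.concatenate([dW, act])

    J = np.stack([jacobian_row(x) for x in X])      # shape (n, p)

    ntk = J @ J.T       # empirical NTK Gram matrix, (n, n)
    fisher = J.T @ J    # Fisher / Gauss-Newton matrix for squared loss, (p, p)

    ev_ntk = np.sort(np.linalg.eigvalsh(ntk))[::-1]
    ev_fisher = np.sort(np.linalg.eigvalsh(fisher))[::-1][:n]
    print(np.allclose(ev_ntk, ev_fisher, atol=1e-8))   # True: nonzero spectra coincide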
-
S&DS Seminar: Aaditya Ramdas (CMU), “Bringing closure to FDR control: a general principle for multiple testing”
Abstract: Since the publication of the seminal Benjamini-Hochberg paper (the most cited paper in statistics), it has been an open problem how the “closure […]
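
For context, a minimal sketch of the classical Benjamini-Hochberg step-up rule that the abstract refers to (the standard procedure only, not the closure-based generalization discussed in the talk):

    import numpy as np

    def benjamini_hochberg(pvals, alpha=0.05):
        # Return a boolean rejection mask controlling the FDR at level alpha.
        p = np.asarray(pvals, dtype=float)
        m = p.size
        order = np.argsort(p)
        thresholds = alpha * np.arange(1, m + 1) / m
        below = p[order] <= thresholds
        reject = np.zeros(m, dtype=bool)
        if below.any():
            k = np.max(np.nonzero(below)[0])   # largest rank passing the step-up rule
            reject[order[: k + 1]] = True
        return reject

    # Example: the two smallest of ten p-values are rejected at alpha = 0.05.
    print(benjamini_hochberg([0.001, 0.008, 0.039, 0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]))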
-
S&DS Seminar: Jingfeng Wu (Berkeley), “Gradient Descent Dominates Ridge: A Statistical View on Implicit Regularization”
Talk summary: A key puzzle in deep learning is how simple gradient methods find generalizable solutions without explicit regularization. This talk discusses the implicit […]
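
A rough illustration of the kind of comparison at play (a hedged sketch of the folklore correspondence, not the talk's result): on least squares, gradient descent stopped after t steps with step size eta behaves much like ridge regression with penalty roughly 1/(eta * t).

    import numpy as np

    rng = np.random.default_rng(1)
    n, d = 50, 10
    X = rng.standard_normal((n, d))
    y = X @ rng.standard_normal(d) + 0.5 * rng.standard_normal(n)

    # Gradient descent on the least-squares objective, stopped early.
    eta, t = 0.01, 200
    w_gd = np.zeros(d)
    for _ in range(t):
        w_gd -= eta * X.T @ (X @ w_gd - y) / n

    # Ridge estimator with the heuristic penalty lambda ~ 1 / (eta * t).
    lam = 1.0 / (eta * t)
    w_ridge = np.linalg.solve(X.T @ X / n + lam * np.eye(d), X.T @ y / n)

    # The agreement is approximate, not exact.
    print(np.linalg.norm(w_gd - w_ridge) / np.linalg.norm(w_ridge))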
-
S&DS Seminar: Adam Smith (BU), “Privacy in Machine Learning and Statistical Inference”
Zoom Link: https://yale.zoom.us/j/94223816617 Meeting ID: 942 2381 6617 Abstract: The results of learning and statistical inference reveal information about the data they use. This […]
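
As background on the topic, a minimal sketch of the standard Laplace mechanism from differential privacy (an illustration of the general idea, not the speaker's constructions): a query whose value changes by at most Delta when one record changes can be released with epsilon-differential privacy by adding Laplace(Delta/epsilon) noise.

    import numpy as np

    def private_mean(x, lower, upper, epsilon, rng=None):
        # Release an epsilon-differentially-private mean of values clipped to [lower, upper].
        rng = np.random.default_rng() if rng is None else rng
        x = np.clip(np.asarray(x, dtype=float), lower, upper)
        sensitivity = (upper - lower) / x.size   # changing one record moves the clipped mean by at most this
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return x.mean() + noise

    print(private_mean([0.2, 0.7, 0.4, 0.9, 0.1], lower=0.0, upper=1.0, epsilon=1.0))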
-
FDS Colloquium: Song Mei (Berkeley), “Revisiting neural network approximation theory in the age of generative AI”
Optional Zoom link: https://yale.zoom.us/j/97222935172 Abstract: Textbooks on deep learning theory primarily perceive neural networks as universal function approximators. While this classical viewpoint is fundamental, it […]
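
A toy numerical illustration of the classical viewpoint the abstract mentions (a sketch only: a wide one-hidden-layer ReLU network with random first layer and least-squares-fitted output layer approximating a smooth function):

    import numpy as np

    rng = np.random.default_rng(2)
    n, m = 200, 500
    x = np.linspace(-3, 3, n)
    y = np.sin(x)

    w = rng.standard_normal(m)                          # random input weights
    b = rng.uniform(-3, 3, m)                           # random biases
    features = np.maximum(np.outer(x, w) + b, 0.0)      # (n, m) ReLU features

    coef, *_ = np.linalg.lstsq(features, y, rcond=None)
    print(np.max(np.abs(features @ coef - y)))          # small max error on the grid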
-
FDS Colloquium: Bento Natura (Columbia), “Faster Exact Linear Programming”
Optional Zoom link: https://yale.zoom.us/j/99342713421 Abstract: We present a novel algorithm to solve various subclasses of linear programs, with a particular focus on strongly polynomial […]
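
For orientation, a generic linear-programming example solved with SciPy's HiGHS backend (a floating-point solver; the talk concerns exact and strongly polynomial algorithms, which this is not):

    from scipy.optimize import linprog

    # Minimize c^T x subject to A_ub x <= b_ub and x >= 0.
    c = [-1.0, -2.0]
    A_ub = [[1.0, 1.0], [1.0, 3.0]]
    b_ub = [4.0, 6.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
    print(res.x, res.fun)   # optimum at x = (3, 1), objective -5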
