Andrew R. Barron has been a full professor of Statistics (now Statistics and Data Science) since his arrival at Yale in 1992. Before that he held faculty positions in Statistics and in Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign, following his PhD from Stanford in 1985. He is known for his work in several areas: (1) at the overlap of information theory and statistics (for example, in the characterization of minimax rates of convergence and in foundational work on properties of the minimum description-length principle), (2) at the overlap of information theory and probability (for example, an entropic proof of the central limit theorem), (3) in artificial neural networks (introducing unifying principles of “Statistical Learning Networks” and providing the first approximation and estimation bounds), and (4) following demonstrations of practical capacity-achieving codes in various other settings, co-inventing Sparse Regression Codes and proving that they are capacity-achieving for communication channels with Gaussian noise. Current interests are in establishing mixing properties of Markov chains suitable for provably feasible and accurate training of deep nets.

What do you do with data science?

Much of Andrew’s work has been broadly within data science, including both its statistical and algorithmic aspects. This includes his work on the determination of minimax rates of estimation, on properties of the minimum description-length principle, on artificial neural network approximation and estimation, and on statistical formulations of communication channel coding and decoding. Current interests explore the evolution of the distribution of parameters used in fitting deep nets and mixture models, under both deterministic and stochastic transition rules. The aim is to find rules of parameter evolution that are provably fast and that generalize with provable accuracy.