Yale's Institute for Foundations of Data Science (FDS), also known as the Kline Tower Institute, was created to advance research in the mathematical, algorithmic, and statistical foundations of data science and their application to other disciplines. The institute brings together faculty from departments and schools across the university to help scholars apply new methods of data science to their research. In turn, exposure to the unmet needs of domain scientists inspires advances in foundational research.
The institute holds its activities at varying venues around campus. In August 2023, FDS will move into Kline Tower, which will provide a venue for data scientists to meet, learn about recent advances, discover exciting problems, host visitors, and hold seminars, workshops, and conferences.
Institute for Foundations of Data Science
10 Hillhouse Avenue
New Haven, CT
Most historical National Football League (NFL) analysis, both mainstream and academic, has relied on play-by-play data to generate team- and player-level trends. Given the number of outside variables that affect on-field results, such as play call and game situation, findings are often no more than interesting anecdotes. With the release of player tracking data, however, analysts can ask and answer questions that better isolate player skill and coaching strategy. In this talk, we highlight the limitations of traditional analyses and use a decades-old punching bag for analysts, fourth-down strategy, as a microcosm for why tracking data is needed.
Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning because of their large energy-efficiency benefits on neuromorphic hardware. In this presentation, I will discuss important techniques for training SNNs that bring substantial benefits in latency, accuracy, interpretability, and robustness.

We will first delve into how training is performed in SNNs. Training SNNs with surrogate gradients offers computational benefits due to short latency; however, because spiking neurons are non-differentiable, training becomes problematic, and surrogate methods have thus been limited to shallow networks. To address this training issue, we will go over a recently proposed method, Batch Normalization Through Time (BNTT), which allows us to train SNNs from scratch with very low latency and enables us to target interesting applications like video segmentation, as well as learning scenarios beyond the traditional, like federated training.

Another critical limitation of SNNs is their lack of interpretability. While considerable attention has been given to optimizing SNNs, the development of explainability is still in its infancy. I will present our recent work on a bio-plausible visualization tool for SNNs, called Spike Activation Map (SAM), which is compatible with BNTT training. SAM highlights spikes with short inter-spike intervals, which carry discriminative information for classification.

Finally, using the proposed BNTT and SAM, I will highlight the robustness of SNNs with respect to adversarial attacks. In the end, I will talk about interesting prospects for SNNs in non-conventional learning scenarios, such as privacy-preserving distributed learning, and in unraveling the temporal correlation in SNNs with feedback connections. Time permitting, I will also discuss the prospects of SNNs for novel and emerging compute-in-memory hardware that can potentially yield an order of magnitude lower power consumption than conventional CPUs/GPUs.
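The abstract above turns on one idea: a spiking neuron's output is a hard threshold, so its true gradient is zero almost everywhere, and surrogate-gradient training substitutes a smooth stand-in derivative during backpropagation. As a rough illustration of that idea only (not the speaker's BNTT method), here is a minimal pure-Python sketch of a leaky integrate-and-fire neuron and a hypothetical sigmoid-shaped surrogate derivative; all function names and parameter values are illustrative assumptions:

```python
import math

def lif_forward(inputs, tau=0.9, v_th=1.0):
    """Leaky integrate-and-fire neuron: leak and accumulate input over time,
    emit a spike (1.0) when the membrane potential crosses threshold, then reset."""
    v, spikes = 0.0, []
    for x in inputs:
        v = tau * v + x                  # leaky integration of input current
        s = 1.0 if v >= v_th else 0.0    # non-differentiable Heaviside spike
        v = v * (1.0 - s)                # hard reset after a spike
        spikes.append(s)
    return spikes, v

def surrogate_grad(v, v_th=1.0, alpha=2.0):
    """Surrogate derivative for the spike function: replace the Heaviside's
    zero-almost-everywhere gradient with a smooth sigmoid-derivative bump
    centered at the threshold, so backprop through time has learning signal."""
    s = 1.0 / (1.0 + math.exp(-alpha * (v - v_th)))
    return alpha * s * (1.0 - s)

spikes, _ = lif_forward([0.6, 0.6, 0.6, 0.0, 1.2])  # → [0.0, 1.0, 0.0, 0.0, 1.0]
```

In a real SNN this surrogate would be wired into the backward pass (e.g. a custom autograd function), while the forward pass keeps the exact thresholded spikes; the surrogate's bump peaks at the threshold, where a small change in membrane potential would actually flip the spike.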
Past Programs & Events
- Multiscale Diffusions, Earth Mover's Distances, Flows - Smita Krishnaswamy, 12 October 2022➔
- Graph Representation Learning: A Geometric Perspective, Rex Ying, 5 October 2022➔
- Balancing covariates in randomized experiments, Daniel A. Spielman, 28 September 2022➔
- Gaming the Learning, Amin Karbasi, 21 September 2022➔
- Institute for Foundations of Data Science debuts with interdisciplinary vision ➔
- Dan Spielman wins the 2023 Breakthrough Prize in Mathematics ➔
- One step closer to creating new hair follicles ➔
- Blockchain not just for bitcoin: It can secure and store genomes, too ➔
- Dragomir Radev Receives the 2022 ACL Distinguished Service Award ➔
- Explosive growth of faculty, courses, and research signals new era for Computer Science at Yale ➔
- Priya Panda Receives Google's Research Scholar Program Award ➔
- Data and politics: Kalla on persuasion, prejudice, and decision making ➔
Sign up to receive announcements of future talks in this series and other events run by FDS.
Apply for a Postdoctoral Position
Yale’s Institute for Foundations of Data Science (FDS) is seeking applications for postdoctoral positions in data science. These generously supported positions, expected to last two to three years, are for independent scholars working on the foundations of data science. FDS postdocs can select multiple mentors from among the members of the institute and can change mentors during their fellowship. This is an opportunity to work with leading theorists as well as domain scientists who are eager to collaborate. A list of the members may be found on this site. Yale’s Data Science Initiative has supported the rapid growth of the departments of Statistics & Data Science and Computer Science, as well as many interdisciplinary activities in which postdocs could participate, including a Data Intensive Social Science Center, a center for Biomedical Data Science, and the Schmidt Program on Artificial Intelligence, Emerging Technologies, and National Power.
Apply to be a Member
Members of FDS come from departments and schools across the university, united by a research interest in the foundations of data science. Applicants with appointments in FAS or SEAS should be ladder faculty, hold teaching positions, or be research scientists; members from other schools should have appointments that allow graduate advising. Reasonable exceptions to this policy are possible. We will review applications from any member of the Yale faculty, but note that we are taking a phased approach to membership so that our numbers grow in line with our capacity. To ensure that we can scale appropriately, we hope you understand if we defer a decision on your membership to a later date.