
FDS Seminar: Priya Panda (Department of Electrical Engineering), "Exploring Robustness and Energy-Efficiency in Neural Systems with Spike-based Machine Intelligence"

Wednesday, November 9, 2022    
4:00PM – 6:00PM

Speaker: Priya Panda

Assistant Professor of Electrical Engineering




Abstract: Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning due to their large energy-efficiency benefits on neuromorphic hardware. In this presentation, I will talk about important techniques for training SNNs that bring substantial benefits in terms of latency, accuracy, interpretability, and robustness. We will first delve into how training is performed in SNNs. Training SNNs with surrogate gradients offers computational benefits due to short latency. However, because spiking neurons are non-differentiable, training is problematic, and surrogate methods have thus been limited to shallow networks. To address this training issue with surrogate gradients, we will go over a recently proposed method, Batch Normalization Through Time (BNTT), that allows us to train SNNs from scratch with very low latency and enables us to target interesting applications like video segmentation, as well as scenarios beyond traditional learning, such as federated training. Another critical limitation of SNNs is their lack of interpretability. While considerable attention has been given to optimizing SNNs, the development of explainability is still in its infancy. I will talk about our recent work on a bio-plausible visualization tool for SNNs, called the Spike Activation Map (SAM), which is compatible with BNTT training. The proposed SAM highlights spikes with short inter-spike intervals, which contain discriminative information for classification. Finally, with the proposed BNTT and SAM, I will highlight the robustness of SNNs with respect to adversarial attacks. In the end, I will talk about interesting prospects of SNNs for non-conventional learning scenarios, such as privacy-preserving distributed learning, as well as unraveling temporal correlations in SNNs with feedback connections.
Finally, time permitting, I will talk about the prospects of SNNs for novel and emerging compute-in-memory hardware that can potentially yield orders-of-magnitude lower power consumption than conventional CPUs/GPUs.
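To make the training challenge mentioned above concrete, the following is a minimal, illustrative sketch (not the speaker's BNTT implementation) of a leaky integrate-and-fire neuron unrolled over time, together with a surrogate gradient that replaces the non-differentiable spike function during backpropagation. All function names, constants, and the sigmoid-derivative surrogate are assumptions for illustration only.

```python
import numpy as np

def surrogate_grad(v, threshold=1.0, alpha=2.0):
    """Smooth stand-in for d(spike)/d(membrane).

    The hard spike function has zero gradient almost everywhere, so
    training substitutes a smooth surrogate (here, a scaled sigmoid
    derivative centered at the firing threshold).
    """
    sig = 1.0 / (1.0 + np.exp(-alpha * (v - threshold)))
    return alpha * sig * (1.0 - sig)

def lif_forward(inputs, leak=0.9, threshold=1.0):
    """Run one leaky integrate-and-fire neuron over T timesteps.

    Returns the binary spike train and the post-reset membrane trace.
    """
    v = 0.0
    spikes, trace = [], []
    for x in inputs:
        v = leak * v + x                        # leaky integration of input current
        s = 1.0 if v >= threshold else 0.0      # hard, non-differentiable spike
        v = v * (1.0 - s)                       # reset membrane after a spike
        spikes.append(s)
        trace.append(v)
    return spikes, trace

spikes, _ = lif_forward([0.5, 0.6, 0.2, 1.2])   # neuron fires at timesteps 2 and 4
```

Unrolling over timesteps like this is what makes normalizing "through time" (as in BNTT) natural: each timestep can carry its own normalization statistics.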

Bio: Priya's research interests lie in neuromorphic computing, spanning energy-efficient design methodologies for deep learning networks, novel supervised/unsupervised learning algorithms for spiking neural networks, and neural architectures for new computing scenarios (such as lifelong learning, generative models, stochastic networks, and adversarial attacks).

Her goal is to empower energy-aware and energy-efficient machine intelligence through algorithm-hardware co-design, while remaining secure against adversarial scenarios and catering to the resource constraints of Internet of Things (IoT) devices.

