Re-inventing Machine Learning for Multiple Distributions: Optimization, Privacy, and Incentives
Speaker: Kumar Kshitij Patel, Toyota Technological Institute at Chicago (TTIC)
Monday, December 16, 2024, 1:00PM - 2:00PM
Webcast: https://yale.zoom.us/j/99456973798
Abstract: Federated Learning (FL) has emerged as a transformative framework for multi-distribution learning, driving breakthroughs in healthcare, research, finance, and consumer technologies. FL enables agents to train models on private data without sharing raw information, offering a basic yet crucial step toward safeguarding privacy while complying with regulations like GDPR. However, the practical deployment of FL poses significant challenges, including data heterogeneity, high communication costs, partial client participation, privacy concerns, and strategic behavior. In this talk, I will present my research addressing these challenges and bridging the gap between the theory and practice of FL and, more broadly, multi-distribution learning.
In the first part of the talk, I will examine the optimization challenges in FL. Specifically, I will outline conditions under which local update algorithms—the most widely used optimization methods in FL—outperform traditional distributed optimization approaches. By formalizing notions of data heterogeneity, I provide practical insights into selecting the appropriate algorithm for real-world applications. I will then explore how personalization mitigates heterogeneity’s adverse effects and discuss its application to private image generation using personalized diffusion models. I will also briefly touch on my work in the online setting and how performative effects can further complicate optimization.
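To make the term "local update algorithms" concrete, the following is a minimal, illustrative sketch, not the speaker's actual method or results. It contrasts a Local SGD / FedAvg-style method, in which each client takes several gradient steps on its own data between communication rounds, with a baseline distributed method that communicates after every step. The toy least-squares clients, loss, and all hyperparameters are hypothetical.

```python
import numpy as np

# Illustrative sketch of a "local update" method (Local SGD / FedAvg style)
# versus a fully synchronized distributed baseline, on hypothetical
# heterogeneous least-squares clients. Hyperparameters are arbitrary.

rng = np.random.default_rng(0)
num_clients, dim = 4, 5

# Each client holds its own data and its own underlying optimum (heterogeneity).
client_A = [rng.normal(size=(50, dim)) for _ in range(num_clients)]
client_w = [rng.normal(size=dim) for _ in range(num_clients)]
client_y = [A @ w for A, w in zip(client_A, client_w)]

def local_grad(w, A, y):
    """Gradient of 0.5 * mean((A w - y)^2) for one client's data."""
    return A.T @ (A @ w - y) / len(y)

def local_sgd(rounds=20, local_steps=10, lr=0.05):
    """Local updates: clients run several gradient steps, then the server averages."""
    w_global = np.zeros(dim)
    for _ in range(rounds):
        client_models = []
        for A, y in zip(client_A, client_y):
            w = w_global.copy()
            for _ in range(local_steps):      # no communication during these steps
                w -= lr * local_grad(w, A, y)
            client_models.append(w)
        w_global = np.mean(client_models, axis=0)  # one communication round
    return w_global

def minibatch_sgd(rounds=200, lr=0.05):
    """Baseline: clients communicate an averaged gradient at every step."""
    w_global = np.zeros(dim)
    for _ in range(rounds):
        g = np.mean([local_grad(w_global, A, y)
                     for A, y in zip(client_A, client_y)], axis=0)
        w_global -= lr * g
    return w_global

def avg_loss(w):
    return np.mean([0.5 * np.mean((A @ w - y) ** 2)
                    for A, y in zip(client_A, client_y)])

print("Local SGD loss:     ", avg_loss(local_sgd()))
print("Mini-batch SGD loss:", avg_loss(minibatch_sgd()))
```

Note that both methods use the same total number of gradient computations here; the difference is how often they communicate, which is the trade-off the talk's heterogeneity conditions speak to.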
The second part of the talk will focus on incentive design and fairness in collaborative learning. I will discuss two natural approaches to fairness for multi-distribution learning and illustrate their implications using my own work. I will also address client defections, a critical issue undermining FL’s effectiveness, and introduce a novel algorithm that prevents defections by ensuring sustainable collaboration. Finally, I will outline my vision for future research on data markets and economic frameworks that incentivize equitable and efficient data sharing.
Bio: Kumar Kshitij Patel is a PhD candidate at the Toyota Technological Institute at Chicago (TTIC), where he is advised by Professors Nathan Srebro and Lingxiao Wang. His research focuses on federated learning, privacy, optimization, and fairness, with an emphasis on addressing challenges related to data heterogeneity and decentralized collaboration. Kumar Kshitij’s work has been recognized with the Distinguished Paper Award at IJCAI 2024 and the Best Paper Honorable Mention Award at the ICML 2023 Federated Learning Workshop. In addition to his research, he has organized workshops and tutorials at venues such as UAI and TTIC, fostering interdisciplinary collaborations in machine learning and data science. Before joining TTIC, he earned a BTech in Computer Science from the Indian Institute of Technology (IIT) Kanpur, where he received the Honda YES Award in 2018.