BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:America/New_York
BEGIN:VEVENT
UID:822@fds.yale.edu
DTSTART;TZID=America/New_York:20241216T130000
DTEND;TZID=America/New_York:20241216T140000
DTSTAMP:20250916T142145Z
URL:https://fds.yale.edu/events/fds-seminar-kumar-kshitij-patel-ttic/
SUMMARY:FDS Seminar: Kumar Kshitij Patel (TTIC)
DESCRIPTION:\n"Re-inventing Machine Learning for Multiple Distributions: O
 ptimization\, Privacy\, and Incentives"\n\n\n\nAbstract: Federated Learni
 ng (FL) has emerged as a transformative framework for multi-distribution l
 earning\, driving breakthroughs in healthcare\, research\, finance\, and c
 onsumer technologies. FL enables agents to train models on private data wi
 thout sharing raw information\, offering a basic yet crucial step toward s
 afeguarding privacy while complying with regulations like GDPR. However\, 
 the practical deployment of FL poses significant challenges\, including da
 ta heterogeneity\, high communication costs\, partial client participation
 \, privacy concerns\, and strategic behavior. In this talk\, I will presen
 t my research addressing these challenges and bridging the gap between the
  theory and practice of FL and\, more broadly\, multi-distribution learnin
 g.\n\n\n\nIn the first part of the talk\, I will examine the optimization 
 challenges in FL. Specifically\, I will outline conditions under which loc
 al update algorithms—the most widely used optimization methods in FL—o
 utperform traditional distributed optimization approaches. By formalizing 
 notions of data heterogeneity\, I provide practical insights into selectin
 g the appropriate algorithm for real-world applications. I will then explo
 re how personalization mitigates heterogeneity's adverse effects and discu
 ss its application to private image generation using personalized diffusio
 n models. I will also briefly touch on my work in the online setting and h
 ow performative effects can further complicate optimization.\n\n\n\nThe se
 cond part of the talk will focus on incentive design and fairness in colla
 borative learning. I will discuss two natural approaches to fairness for m
 ulti-distribution learning and illustrate their implications using my own 
 work. I will also address client defections\, a critical issue undermining
  FL's effectiveness\, and introduce a novel algorithm that prevents defect
 ions by ensuring sustainable collaboration. Finally\, I will outline my vi
 sion for future research on data markets and economic frameworks that ince
 ntivize equitable and efficient data sharing.\n\n\n\nBio: Kumar Kshi
 tij Patel is a PhD candidate at the Toyota Technological Institute at Chic
 ago (TTIC)\, where he is advised by Professors Nathan Srebro and Lingxiao 
 Wang. His research focuses on federated learning\, privacy\, optimization\
 , and fairness\, with an emphasis on addressing challenges related to data
  heterogeneity and decentralized collaboration. Kumar Kshitij’s work has
  been recognized with the Distinguished Paper Award at IJCAI 2024 and the 
 Best Paper Honorable Mention Award at the ICML 2023 Federated Learning Wor
 kshop. In addition to his research\, he has organized workshops and tutori
 als at venues such as UAI and TTIC\, fostering interdisciplinary collabora
 tions in machine learning and data science. Before joining TTIC\, he earne
 d a BTech in Computer Science from the Indian Institute of Technology (IIT
 ) Kanpur\, where he received the Honda YES Award in 2018.\n
CATEGORIES:FDS Events,Postdoctoral Applicants
LOCATION:https://yale.zoom.us/j/99456973798
END:VEVENT
BEGIN:VTIMEZONE
TZID:America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:STANDARD
DTSTART:20241103T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR