BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:America/New_York
BEGIN:VEVENT
UID:515@fds.yale.edu
DTSTART;TZID=America/New_York:20240327T113000
DTEND;TZID=America/New_York:20240327T130000
DTSTAMP:20250916T142131Z
URL:https://fds.yale.edu/events/fds-colloquium-aaditya-ramdas-cmu/
SUMMARY:FDS Colloquium: Aaditya Ramdas (CMU)\, "The numeraire e-variable an
 d reverse information projection"
DESCRIPTION:Speaker: Aaditya Ramdas\nAssistant Professor\, Department 
 of Statistics & Data Science (75%)\, Machine Learning Department (25%)
 \, Carnegie Mellon University\n\nWednesday\, March 27\, 2024\nLunch: 1
 1:30 am (Kitchen)\nTalk: 12:00 pm (Seminar Room #1327)\nat the Yale In
 stitute for Foundations of Data Science\, Kline Tower\, 13th Floor\n\n
 Title: The numeraire e-variable and reverse information projection\n\n
 Abstract: In an excellent 1999 Yale PhD thesis\, Jonathan Li proposed 
 and defined a critical concept that he called the reverse information 
 projection (RIPr)\, which is akin to a KL projection of a distribution
  onto a set of probability measures. This concept has gained prominenc
 e recently in game-theoretic statistics and sequential testing by bett
 ing\, because it characterizes the log-optimal bet/e-variable of a poi
 nt alternative hypothesis against a composite null hypothesis. However
 \, it required assumptions of convexity of the set of distributions an
 d a common reference measure to define densities. In this talk\, we wi
 ll show how to fully generalize the theory underlying the RIPr\, showi
 ng that it is always well defined\, without any assumptions on the dis
 tributions involved. Further\, a strong duality result identifies it a
 s the dual to an optimal bet/e-variable called the numeraire\, which i
 s unique and always exists without assumptions. This fully generalizes
  Kelly betting to composite nulls\, as well as results by Grunwald and
  coauthors on safe testing. The talk will not assume any prior knowled
 ge of these topics.\n\nThis is joint work with Martin Larsson and Joha
 nnes Ruf (https://arxiv.org/abs/2402.18810).\n\nBio: Aaditya Ramdas (P
 hD\, 2015) is an assistant professor at Carnegie Mellon University\, i
 n the Departments of Statistics and Machine Learning. His research int
 erests include game-theoretic statistics and sequential anytime-valid 
 inference\, multiple testing and post-selection inference\, and uncert
 ainty quantification for machine learning (conformal prediction\, cali
 bration). His applied areas of interest include neuroscience\, genetic
 s\, and auditing (real estate\, finance\, elections). Aaditya received
  the IMS Peter Gavin Hall Early Career Prize\, the COPSS Emerging Lead
 er Award\, the Bernoulli New Researcher Award\, the NSF CAREER Award\,
  the Sloan Fellowship in Mathematics\, and faculty research awards fro
 m Adobe and Google. He also spends 20% of his time at Amazon working o
 n causality and sequential experimentation.\n\nWebsite: https://www.st
 at.cmu.edu/~aramdas/
ATTACH;FMTTYPE=image/jpeg:https://fds.yale.edu/wp-content/uploads/2024/02/
 GUEST-Aaditya-Ramdas-5d84da4265faf3e3-276x300.jpg
CATEGORIES:FDS Events,Colloquium,Seminar Series
END:VEVENT
BEGIN:VTIMEZONE
TZID:America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:DAYLIGHT
DTSTART:20240310T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20241103T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR