BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//6.4.8//EN
X-WR-TIMEZONE:America/New_York
BEGIN:VEVENT
UID:430@fds.yale.edu
DTSTART;TZID=America/New_York:20230112T150000
DTEND;TZID=America/New_York:20230112T160000
DTSTAMP:20240226T190411Z
URL:https://fds.yale.edu/events/fds-seminar-arnab-auddy-columbia/
SUMMARY:FDS Seminar: Arnab Auddy (Columbia)
DESCRIPTION:"Statistical Benefits and Computational Challenges of Tensor Sp
 ectral Learning"\n\n\n\nTalk Abstract: Given multivariate observations from
  a statistical model\, tensors are a natural way of recording higher-order
  interactions among variables. Tensor spectral learning is a collection of
  methods wherein we aim to decompose a tensor into its components\, each o
 f which corresponds to an interpretable feature of the model. This approach h
 as recently received a lot of attention for its application to latent vari
 able models. In this talk\, I will focus on orthogonally decomposable tens
 ors\, which arise naturally in many problems. These tensors have a decompo
 sition that can be interpreted very similarly to the matrix SVD\, but automati
 cally provides much better identifiability properties than their matrix co
 unterparts. I will show that in such a tensor decomposition\, a small pert
 urbation affects each singular vector in isolation\, and their estimabil
 ity does not depend on the gap between consecutive singular values. In con
 trast to these attractive statistical properties\, in general\, tensor met
 hods present us with intriguing computational considerations. I will illus
 trate these phenomena in the particular application to a spiked tensor PCA
  problem and in Independent Component Analysis (ICA). Interestingly\, there
  is a gap between the information-theoretic and computationally tractable li
 mits of both problems. Above the computational threshold\, we provide nois
 e-robust algorithms based on spectral truncation\, which yield rate-opti
 mal estimators. Our estimators are also asymptotically normal\, thus allowin
 g confidence interval construction. Finally\, I will present some examples d
 emonstrating our theoretical findings.\n\n\n\nThis talk was held virtually
  on January 12\, 2023 @ 3:00 pm\n
CATEGORIES:Postdoctoral Applicants
LOCATION:Webcast
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS=\, ;X-APPLE-RADIUS=100;X-TI
 TLE=Webcast:geo:0,0
END:VEVENT
BEGIN:VTIMEZONE
TZID:America/New_York
X-LIC-LOCATION:America/New_York
BEGIN:STANDARD
DTSTART:20221106T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR