BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Yale Institute for Foundations of Data Science//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
BEGIN:VEVENT
UID:1013-0
SUMMARY:S&DS Seminar: Ilias Zadik (MIT)
DTSTART:20230223T153000Z
DTEND:20230223T163000Z
DTSTAMP:20230203T051529Z
LAST-MODIFIED:20230223T195954Z
SEQUENCE:0
LOCATION:DL220, 10 Hillhouse Ave, 2nd Floor, New Haven, CT 06511
DESCRIPTION:\nSpeaker: Ilias Zadik\, MIT\n\nIn-person seminars are held in Dunham Lab Room 220 with optional remote access: (https://yale.hosted.panopto.com/Panopto/Pages/Sessions/List.aspx?folderID=f8b73c34-a27b-42a7-a073-af2d00f90ffa)\n\nThe price of computational efficiency in high-dimensional estimation\n\nAbstract: In recent years we have experienced remarkable growth in the number and size of available datasets. This growth has led to an intense and challenging pursuit of estimators that are provably both computationally efficient and statistically accurate. Notably\, the analysis of polynomial-time estimators has revealed intriguing phenomena in several high-dimensional estimation tasks\, such as the apparent failure of such estimators to reach the optimal statistical guarantees achievable among all estimators (that is\, the presence of a non-trivial “computational-statistical trade-off”).\n\nIn this talk\, I will present new algorithmic results for the well-studied planted clique model and for the fundamental sparse regression model. For planted clique\, we reveal the surprisingly severe failure of the Metropolis process to succeed in polynomial time\, even when simple degree heuristics succeed. In particular\, our result resolves a well-known 30-year-old open problem on the performance of the Metropolis process for this model\, posed by Jerrum in 1992. For sparse regression\, we show the failure of large families of polynomial-time estimators\, such as MCMC and low-degree polynomial methods\, to improve upon the best-known polynomial-time regression methods. As an outcome\, our work offers rigorous evidence that popular regression methods such as LASSO optimally balance their computational and statistical resources.\n\nBio: My research lies broadly at the interface of high-dimensional statistics\, the theory of machine learning and computation\, and applied probability. Much of my work aims to build and use mathematical tools that bring insight into the computational and statistical challenges of modern machine learning tasks. Website: https://iliaszadik.github.io/\n\nThursday\, February 23\, 2023\n\n10:30 am - 11:30 am - Talk - Dunham Lab\, Room 220\, 10 Hillhouse Avenue\, 2nd Floor\, with the option of virtual participation\n
END:VEVENT
END:VCALENDAR