Manolis Zampetakis

Assistant Professor of Computer Science

Contact Information

Office

(857) 263-1079

Website

mzampet.com

Manolis Zampetakis is currently an Assistant Professor of Computer Science at Yale University. Before Yale, he was a post-doctoral researcher in the EECS Department at UC Berkeley, working with Michael Jordan. He received his PhD from the EECS Department at MIT, where he was advised by Constantinos Daskalakis. He has been awarded the Google PhD Fellowship and the ACM SIGEcom Doctoral Dissertation Award. His research interests include theoretical machine learning (ML), algorithmic game theory, statistics, optimization, and complexity theory.

What do you do with Data Science?

My work on data science has four main components: data analysis from systematically biased data, understanding the convergence properties of popular heuristic methods, theoretical understanding of optimization methods used in machine learning and data science, and economic and strategic aspects that arise in data science environments. As an example, I have worked on statistical analysis in the presence of truncation bias and self-selection bias. Truncation occurs when samples that fall outside a subset of the support of the population are never observed. Imagine using a telescope to take measurements of the sky: no matter how good the telescope is, the received data will be truncated due to the bounded region of the measuring device. Such phenomena have many manifestations in economics, the social sciences, biological studies, and all areas of the physical sciences, and date back to famous statisticians such as Pearson, Lee, and Fisher. In our work we are the first to develop estimation algorithms that are provably sample- and computationally efficient in the presence of truncation bias and that apply to a very broad class of problems. One of my goals for the future is to understand the connections and implications of this line of work for causal inference. Some of my works in this area are the following:

Data Analysis from Biased Data
- What Makes a Good Fisherman? Linear Regression under Self-Selection Bias, with Yeshwanth Cherapanamjeri, Constantinos Daskalakis, and Andrew Ilyas
- A Statistical Taylor Theorem and Extrapolation of Truncated Densities, with Constantinos Daskalakis, Vasilis Kontonis, and Christos Tzamos
- Efficient Statistics, in High Dimensions, from Truncated Samples, with Constantinos Daskalakis, Themis Gouleakis, and Christos Tzamos

Convergence of Popular Heuristic Methods
- Estimation and Inference with Trees and Forests in High Dimensions, with Vasilis Syrgkanis
- Ten Steps of EM Suffice for Mixtures of Two Gaussians, with Constantinos Daskalakis and Christos Tzamos

Nonconvex Optimization
- Deterministic Nonsmooth Nonconvex Optimization, with Michael I. Jordan, Guy Kornowski, Ohad Shamir, and Tianyi Lin
- The Computational Complexity of Finding Stationary Points in Non-Convex Optimization, with Alexandros Hollender
- The Complexity of Constrained Min-Max Optimization, with Constantinos Daskalakis and Stratis Skoulakis
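To see how truncation bias distorts even the simplest statistic, here is a minimal simulation sketch (not taken from the papers above; the detection threshold of -1.0 is an arbitrary illustration): we draw from a standard normal population but only ever "observe" samples above a threshold, and the naive sample mean comes out systematically biased.

```python
import random
import statistics

random.seed(0)

# True population: standard normal, so the true mean is 0.0.
# Truncation: only samples above the threshold are ever recorded,
# mimicking a measuring device with a limited detection range.
threshold = -1.0
population = [random.gauss(0.0, 1.0) for _ in range(100_000)]
observed = [x for x in population if x >= threshold]

naive_mean = statistics.fmean(observed)
print(f"true mean: 0.0, naive mean of truncated sample: {naive_mean:.3f}")
# The naive estimate is biased upward because the left tail below the
# threshold is systematically missing from the observed data.
```

A consistent estimator has to explicitly account for the truncation set when fitting the population parameters, which is exactly what makes the problem computationally and statistically nontrivial in high dimensions.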
