
FDS Colloquium: Sofya Raskhodnikova (Boston), “Privately evaluating untrusted black-box functions”

Wednesday, September 11, 2024    
11:30AM – 1:00PM
Yale Institute for Foundations of Data Science
Kline Tower 13th Floor, Room 1327
New Haven, CT 06511

Speaker: Sofya Raskhodnikova

Professor of Computer Science, Boston University


Lunch at 11:30am in Room 1307; talk at 12:00pm in Room 1327



Abstract

We provide tools for sharing sensitive data in situations when the data curator does not know in advance what questions an (untrusted) analyst might want to ask about the data. The analyst can specify a program that they want the curator to run on the dataset. We model the program as a black-box function $f$. We study differentially private algorithms, called privacy wrappers, that, given black-box access to a real-valued function $f$ and a sensitive dataset $x$, output an accurate approximation to $f(x)$. The dataset $x$ is modeled as a finite subset of a possibly infinite universe, in which each entry represents the data of one individual. A privacy wrapper calls $f$ on the dataset $x$ and on some subsets of $x$ and returns either an approximation to $f(x)$ or a nonresponse symbol $\perp$. The wrapper may also use additional information (that is, parameters) provided by the analyst, but differential privacy is required for all values of these parameters. Setting these parameters correctly ensures better accuracy of the privacy wrapper. The bottleneck in the running time of our privacy wrappers is the number of calls to $f$, which we refer to as queries. Our goal is to design privacy wrappers with high accuracy and small query complexity.
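To make the role of analyst-supplied parameters concrete, here is a minimal Python sketch (not from the talk; `naive_wrapper` and its parameter names are hypothetical) of the obvious first attempt: query $f$ once and add Laplace noise scaled to a sensitivity bound claimed by the analyst. Because a privacy wrapper must be differentially private for all values of the analyst's parameters, this baseline does not qualify: it is private only when the claimed bound actually holds.

```python
import math
import random

def laplace_noise(scale):
    # Sample Laplace(0, scale) via the inverse-CDF transform.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def naive_wrapper(f, x, claimed_sensitivity, eps):
    """Baseline, NOT a valid privacy wrapper: one query to f, with
    noise calibrated to the analyst's claimed sensitivity bound.
    This is eps-differentially private only if claimed_sensitivity
    truly bounds how much f can change between neighboring datasets;
    an adversarial analyst could claim a tiny bound for a highly
    sensitive f and learn about individuals in x.
    """
    return f(x) + laplace_noise(claimed_sensitivity / eps)
```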

We consider two settings: in the automated sensitivity detection setting, the analyst supplies only the black-box function $f$ and the intended (finite) range of $f$; in the provided sensitivity bound setting, the analyst also supplies additional parameters that describe the sensitivity of $f$. We define accuracy for both settings. We present the first privacy wrapper for the automated sensitivity detection setting. For the setting where a sensitivity bound is provided by the analyst, we design privacy wrappers with simultaneously optimal accuracy and query complexity, improving on the constructions provided (or implied) by previous work. We also prove tight lower bounds for both settings. In addition to addressing the black-box privacy problem, our private mechanisms provide feasibility results for the differentially private release of general classes of functions. 
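For intuition about what is possible in the automated sensitivity detection setting, where only $f$ and its finite range are supplied, the following hedged sketch shows one classical approach, subsample-and-aggregate (Smith, 2011). It is differentially private for every black-box $f$ and makes $k$ queries; it is not the construction from the talk, and all names below are illustrative.

```python
import math
import random

def private_median(values, value_range, eps):
    """Exponential mechanism for the median over a finite range.
    Utility of a candidate r is -max(#values below r, #values above r);
    changing one value shifts this by at most 1, so the utility has
    sensitivity 1 and sampling proportionally to exp(eps * utility / 2)
    is eps-differentially private.
    """
    def utility(r):
        below = sum(1 for v in values if v < r)
        above = sum(1 for v in values if v > r)
        return -max(below, above)

    weights = [math.exp(eps * utility(r) / 2.0) for r in value_range]
    pick = random.random() * sum(weights)
    cumulative = 0.0
    for r, w in zip(value_range, weights):
        cumulative += w
        if pick <= cumulative:
            return r
    return value_range[-1]

def subsample_and_aggregate(f, x, value_range, eps, k=10):
    # Each individual lands in exactly one block, so adding or removing
    # one person changes at most one of the k answers; a 1-sensitive
    # private median of those answers is then eps-differentially
    # private for any black-box f, at the cost of k queries.
    x = list(x)
    blocks = [x[i::k] for i in range(k)]
    answers = [f(block) for block in blocks]
    return private_median(answers, value_range, eps)
```

The accuracy of this baseline suffers when $f$ behaves differently on small blocks than on all of $x$, which is one reason achieving simultaneously high accuracy and small query complexity, as the abstract targets, is nontrivial.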

Joint work with Ephraim Linder, Adam Smith, and Thomas Steinke.

Bio: Sofya Raskhodnikova is a professor of Computer Science at Boston University. Her office is located in the newly constructed Center for Computing and Data Sciences, colloquially known as the Jenga building. She received all her degrees (S.B., S.M., and Ph.D.) from MIT and was then a postdoctoral fellow at the Hebrew University of Jerusalem and the Weizmann Institute of Science. She was a Professor of Computer Science and Engineering at Penn State and has held visiting positions at the Institute for Pure and Applied Mathematics at UCLA, Boston University, Harvard University, and the Simons Institute for the Theory of Computing at Berkeley. Sofya works in the areas of randomized and approximation algorithms. Her main interest is the design and analysis of sublinear-time algorithms for combinatorial problems. She has also made important contributions to data privacy.
