
Publication


Featured research published by Uri Stemmer.


Theory of Computing | 2013

Private Learning and Sanitization: Pure vs. Approximate Differential Privacy

Amos Beimel; Kobbi Nissim; Uri Stemmer

We compare the sample complexity of private learning and sanitization tasks under pure ε-differential privacy [Dwork, McSherry, Nissim, and Smith TCC 2006] and approximate (ε, δ)-differential privacy [Dwork, Kenthapadi, McSherry, Mironov, and Naor EUROCRYPT 2006]. We show that the sample complexity of these tasks under approximate differential privacy can be significantly lower than that under pure differential privacy.
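The pure-versus-approximate contrast in the abstract can be illustrated with the two canonical noise-addition mechanisms: Laplace noise gives pure ε-DP, while Gaussian noise (under the classic calibration, valid for ε < 1) gives only (ε, δ)-DP but often with different utility trade-offs. This is a generic sketch of those textbook mechanisms, not code from the paper:

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Pure eps-DP: add Laplace noise with scale sensitivity / epsilon.
    A Laplace sample is the difference of two exponential samples."""
    scale = sensitivity / epsilon
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return true_value + noise

def gaussian_mechanism(true_value, sensitivity, epsilon, delta):
    """Approximate (eps, delta)-DP: Gaussian noise with the classic
    calibration sigma = sensitivity * sqrt(2 ln(1.25 / delta)) / epsilon
    (valid for epsilon < 1)."""
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / delta)) / epsilon
    return random.gauss(true_value, sigma)

# Example: privately release a count query (sensitivity 1) two ways.
noisy_pure = laplace_mechanism(100.0, 1.0, epsilon=0.5)
noisy_approx = gaussian_mechanism(100.0, 1.0, epsilon=0.5, delta=1e-5)
```

The δ parameter is the probability mass on which the pure guarantee may fail; the paper's point is that allowing even a tiny δ can shrink the number of samples certain learning and sanitization tasks require.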


Conference on Innovations in Theoretical Computer Science | 2013

Characterizing the sample complexity of private learners

Amos Beimel; Kobbi Nissim; Uri Stemmer

In 2008, Kasiviswanathan et al. defined private learning as a combination of PAC learning and differential privacy [16]. Informally, a private learner is applied to a collection of labeled individual information and outputs a hypothesis while preserving the privacy of each individual. Kasiviswanathan et al. gave a generic construction of private learners for (finite) concept classes, with sample complexity logarithmic in the size of the concept class. This sample complexity is higher than what is needed for non-private learners, leaving open the possibility that the sample complexity of private learning may sometimes be significantly higher than that of non-private learning. We give a combinatorial characterization of the sample size sufficient and necessary to privately learn a class of concepts. This characterization is analogous to the well-known characterization of the sample complexity of non-private learning in terms of the VC dimension of the concept class. We introduce the notion of a probabilistic representation of a concept class, and our new complexity measure RepDim corresponds to the size of the smallest probabilistic representation of the concept class. We show that any private learning algorithm for a concept class C with sample complexity m implies RepDim(C) = O(m), and that there exists a private learning algorithm with sample complexity m = O(RepDim(C)). We further demonstrate that a similar characterization holds for the database size needed for privately computing a large class of optimization problems and also for the well-studied problem of private data release.
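The generic construction mentioned above (a private learner for any finite concept class, with sample complexity logarithmic in the class size) can be read as an exponential-mechanism selection of a low-empirical-error hypothesis. The following is a toy sketch under that reading; the threshold class and all names are hypothetical illustrations, not the paper's construction:

```python
import math
import random

def private_learn(samples, concepts, epsilon):
    """Select a hypothesis from a finite concept class with probability
    proportional to exp(-epsilon * empirical_errors / 2). Changing one
    labeled example changes any error count by at most 1, so this
    exponential-mechanism selection satisfies eps-differential privacy."""
    errors = [sum(1 for x, y in samples if c(x) != y) for c in concepts]
    weights = [math.exp(-epsilon * e / 2.0) for e in errors]
    r = random.random() * sum(weights)
    for c, w in zip(concepts, weights):
        r -= w
        if r <= 0:
            return c
    return concepts[-1]

# Hypothetical toy class: thresholds c_t(x) = 1 iff x >= t, for t in 0..10.
thresholds = [(lambda x, t=t: int(x >= t)) for t in range(11)]
# 200 labeled samples consistent with the threshold t = 5.
data = [(x, int(x >= 5)) for x in range(10)] * 20
h = private_learn(data, thresholds, epsilon=2.0)
```

With enough samples, the weight on the zero-error hypothesis dominates, which is why the construction's sample complexity scales with log of the class size; the paper's RepDim measure pins down when one can do better.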


Conference on Innovations in Theoretical Computer Science | 2016

Simultaneous Private Learning of Multiple Concepts

Mark Bun; Kobbi Nissim; Uri Stemmer

We investigate the direct-sum problem in the context of differentially private PAC learning: What is the sample complexity of solving k learning tasks simultaneously under differential privacy, and how does this cost compare to that of solving k learning tasks without privacy? In our setting, an individual example consists of a domain element x labeled by k unknown concepts (c1,...,ck). The goal of a multi-learner is to output k hypotheses (h1,...,hk) that generalize the input examples. Without concern for privacy, the sample complexity needed to simultaneously learn k concepts is essentially the same as that needed for learning a single concept. Under differential privacy, the basic strategy of learning each hypothesis independently yields sample complexity that grows polynomially with k. For some concept classes, we give multi-learners that require fewer samples than the basic strategy. Unfortunately, however, we also give lower bounds showing that even for very simple concept classes, the sample cost of private multi-learning must grow polynomially in k.


Symposium on the Theory of Computing | 2016

Algorithmic stability for adaptive data analysis

Raef Bassily; Kobbi Nissim; Adam D. Smith; Thomas Steinke; Uri Stemmer; Jonathan Ullman


Foundations of Computer Science | 2015

Differentially Private Release and Learning of Threshold Functions

Mark Bun; Kobbi Nissim; Uri Stemmer; Salil P. Vadhan


Neural Information Processing Systems | 2017

Practical Locally Private Heavy Hitters

Raef Bassily; Kobbi Nissim; Uri Stemmer; Abhradeep Thakurta


arXiv: Learning | 2015

On the Generalization Properties of Differential Privacy

Kobbi Nissim; Uri Stemmer


International Workshop on Approximation, Randomization, and Combinatorial Optimization: Algorithms and Techniques | 2013

Private Learning and Sanitization: Pure vs. Approximate Differential Privacy

Amos Beimel; Kobbi Nissim; Uri Stemmer


Symposium on Discrete Algorithms | 2015

Learning privately with labeled and unlabeled examples

Amos Beimel; Kobbi Nissim; Uri Stemmer


Symposium on Principles of Database Systems | 2016

Locating a Small Cluster Privately

Kobbi Nissim; Uri Stemmer; Salil P. Vadhan

Collaboration


An overview of Uri Stemmer's collaborations.

Top Co-Authors

Kobbi Nissim (Ben-Gurion University of the Negev)
Amos Beimel (Ben-Gurion University of the Negev)
Abhradeep Thakurta (Pennsylvania State University)
Adam D. Smith (Pennsylvania State University)