Shiva Prasad Kasiviswanathan
General Electric
Publications
Featured research published by Shiva Prasad Kasiviswanathan.
SIAM Journal on Computing | 2011
Shiva Prasad Kasiviswanathan; Homin K. Lee; Kobbi Nissim; Sofya Raskhodnikova; Adam D. Smith
Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals. We present several basic results that demonstrate the general feasibility of private learning and relate several models previously studied separately in the contexts of privacy and standard learning.
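The differential-privacy guarantee described above can be illustrated with randomized response, one of the simplest differentially private mechanisms. The sketch below is illustrative only and is not the construction from the paper; the function names and the debiasing step are my own.

```python
import math
import random

def randomized_response(true_bit, epsilon):
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it. This satisfies epsilon-differential privacy
    for a single binary attribute."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if random.random() < p_truth else 1 - true_bit

def private_mean_estimate(bits, epsilon):
    """Unbiased estimate of the mean of `bits` recovered from the
    noisy reports, by inverting the known flipping probability."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    noisy = [randomized_response(b, epsilon) for b in bits]
    raw = sum(noisy) / len(noisy)
    # E[raw] = p * mean + (1 - p) * (1 - mean); solve for mean.
    return (raw - (1 - p)) / (2 * p - 1)
```

No single individual's bit strongly influences the output distribution, yet the aggregate mean is still recoverable, which is the trade-off the paper's notion of private learning formalizes.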
Conference on Information and Knowledge Management | 2011
Shiva Prasad Kasiviswanathan; Prem Melville; Arindam Banerjee; Vikas Sindhwani
Streaming user-generated content in the form of blogs, microblogs, forums, and multimedia sharing sites provides a rich source of data from which invaluable information and insights may be gleaned. Given the vast volume of such social media data being continually generated, one of the challenges is to automatically tease apart the emerging topics of discussion from the constant background chatter. Such emerging topics can be identified by the appearance of multiple posts on a unique subject matter, which is distinct from previous online discourse. We address the problem of identifying emerging topics through the use of dictionary learning. We propose a two-stage approach based, respectively, on detection and clustering of novel user-generated content. We derive a scalable approach by using the alternating directions method to solve the resulting optimization problems. Empirical results show that our proposed approach is more effective than several baselines in detecting emerging topics in traditional news story and newsgroup data. We also demonstrate a practical application to social media analysis through a study of streaming data from Twitter.
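The core idea of the detection stage — flag a document as novel when it is poorly reconstructed from a dictionary of past content — can be sketched as follows. This is a simplified illustration, not the paper's method: ridge regression in closed form stands in for the sparse solver (the paper uses the alternating directions method), and all names are hypothetical.

```python
import numpy as np

def novelty_score(x, D, reg=0.1):
    """Score how poorly document vector x is explained by the
    dictionary D (one past document per column). A large relative
    residual suggests an emerging topic."""
    k = D.shape[1]
    # Closed-form ridge solution of min_w ||x - D w||^2 + reg ||w||^2.
    w = np.linalg.solve(D.T @ D + reg * np.eye(k), D.T @ x)
    return np.linalg.norm(x - D @ w) / np.linalg.norm(x)
```

Documents scoring above a threshold would be passed to the second, clustering stage to group novel posts into candidate emerging topics.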
Algorithmic Applications in Management | 2007
Martin Fürer; Shiva Prasad Kasiviswanathan
An algorithm is presented for exactly counting the number of maximum weight satisfying assignments of a 2-CNF formula. The worst-case running time of O(1.246^n) for formulas with n variables improves on the previous bound of O(1.256^n) by Dahllöf, Jonsson, and Wahlström. The algorithm uses only polynomial space. As a direct consequence we get an O(1.246^n) time algorithm for counting maximum weighted independent sets in a graph.
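For reference, the quantity computed in the consequence above — the number of maximum-weight independent sets — can be checked by brute force in O(2^n) time on tiny graphs. This sketch is only a naive baseline, not the paper's O(1.246^n) branching algorithm:

```python
from itertools import combinations

def count_max_weight_independent_sets(n, edges, weights):
    """Return (max weight, number of independent sets attaining it)
    by enumerating all 2^n vertex subsets."""
    edge_set = set(map(frozenset, edges))
    best, count = float("-inf"), 0
    for mask in range(1 << n):
        verts = [v for v in range(n) if mask >> v & 1]
        # Skip subsets containing an edge (not independent).
        if any(frozenset(p) in edge_set for p in combinations(verts, 2)):
            continue
        w = sum(weights[v] for v in verts)
        if w > best:
            best, count = w, 1
        elif w == best:
            count += 1
    return best, count
```

On a triangle with unit weights this returns (1, 3): the three singletons are the maximum-weight independent sets.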
Theory of Cryptography Conference | 2010
Amos Beimel; Shiva Prasad Kasiviswanathan; Kobbi Nissim
Learning is a task that generalizes many of the analyses that are applied to collections of data, and in particular, collections of sensitive individual information. Hence, it is natural to ask what can be learned while preserving individual privacy. [Kasiviswanathan, Lee, Nissim, Raskhodnikova, and Smith; FOCS 2008] initiated such a discussion. They formalized the notion of private learning, as a combination of PAC learning and differential privacy, and investigated what concept classes can be learned privately. Somewhat surprisingly, they showed that, ignoring time complexity, every PAC learning task could be performed privately with polynomially many samples, and in many natural cases this could even be done in polynomial time. While these results seem to equate non-private and private learning, there is still a significant gap: the sample complexity of (non-private) PAC learning is crisply characterized in terms of the VC-dimension of the concept class, whereas this relationship is lost in the constructions of private learners, which exhibit, generally, a higher sample complexity. Looking into this gap, we examine several private learning tasks and give tight bounds on their sample complexity. In particular, we show strong separations between sample complexities of proper and improper private learners (such separation does not exist for non-private learners), and between sample complexities of efficient and inefficient proper private learners. Our results show that VC-dimension is not the right measure for characterizing the sample complexity of proper private learning. We also examine the task of private data release (as initiated by [Blum, Ligett, and Roth; STOC 2008]), and give new lower bounds on the sample complexity. Our results show that the logarithmic dependence on size of the instance space is essential for private data release.
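The generic feasibility result referenced above (every finite concept class is privately learnable, ignoring efficiency) relies on privately selecting a hypothesis with low empirical error. The standard tool for such private selection is the exponential mechanism; the sketch below is a generic illustration of that tool under the assumption of a score with bounded sensitivity, not the specific constructions analyzed in the paper.

```python
import math
import random

def exponential_mechanism(candidates, score, data, epsilon, sensitivity=1.0):
    """Sample a candidate with probability proportional to
    exp(epsilon * score / (2 * sensitivity)). High-scoring hypotheses
    (e.g. low empirical error) are exponentially more likely, while
    any single record changes each probability only slightly."""
    weights = [math.exp(epsilon * score(c, data) / (2 * sensitivity))
               for c in candidates]
    r = random.random() * sum(weights)
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return c
    return candidates[-1]
```

The gap studied in the paper concerns how many samples are needed before the best hypothesis's score dominates the selection — which, for proper private learning, turns out not to be governed by VC-dimension alone.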
Parameterized and Exact Computation | 2009
Martin Fürer; Serge Gaspers; Shiva Prasad Kasiviswanathan
The bandwidth of a graph G on n vertices is the minimum b such that the vertices of G can be labeled from 1 to n such that the labels of every pair of adjacent vertices differ by at most b. In this paper, we present a 2-approximation algorithm for the Bandwidth problem that takes worst-case \mathcal{O}(1.9797^n) = \mathcal{O}(3^{0.6217 n}) time and uses polynomial space. This improves both the previous best 2- and 3-approximation algorithms of Cygan et al., which have an \mathcal{O}^*(3^n) worst-case time bound.
Very Large Data Bases | 2015
Hao Huang; Shiva Prasad Kasiviswanathan
Workshop on Algorithms and Data Structures | 2007
Piotr Berman; Shiva Prasad Kasiviswanathan
ACM Symposium on Parallel Algorithms and Architectures | 2007
Piotr Berman; Jieun Jeong; Shiva Prasad Kasiviswanathan; Bhuvan Urgaonkar
Theoretical Computer Science | 2013
Martin Fürer; Serge Gaspers; Shiva Prasad Kasiviswanathan
Workshop on Algorithms and Data Structures | 2007
Martin Fürer; Shiva Prasad Kasiviswanathan
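For intuition about the bandwidth measure defined in the Fürer–Gaspers–Kasiviswanathan abstract above, here is a brute-force exact computation over all n! labelings. It is only a toy reference point for tiny graphs, not the paper's \mathcal{O}(1.9797^n) approximation algorithm:

```python
from itertools import permutations

def bandwidth(adj):
    """Exact bandwidth of a graph given as adjacency lists:
    minimize, over all labelings of vertices with 1..n, the
    maximum label difference across an edge."""
    n = len(adj)
    best = n  # bandwidth is at most n - 1
    for perm in permutations(range(n)):
        label = {v: i + 1 for i, v in enumerate(perm)}
        b = max((abs(label[u] - label[v]) for u in range(n) for v in adj[u]),
                default=0)
        best = min(best, b)
    return best
```

A path on four vertices has bandwidth 1 (label it in order), while a 4-cycle or a star with three leaves already forces bandwidth 2.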