David Eisenstat
Brown University
Publications
Featured research published by David Eisenstat.
International Symposium on Distributed Computing | 2007
Dana Angluin; James Aspnes; David Eisenstat
We describe and analyze a 3-state one-way population protocol for approximate majority in the model in which pairs of agents are drawn uniformly at random to interact. Given an initial configuration of x's, y's and blanks that contains at least one non-blank, the goal is for the agents to reach consensus on one of the values x or y. Additionally, the value chosen should be the majority non-blank initial value, provided it exceeds the minority by a sufficient margin. We prove that with high probability n agents reach consensus in O(n log n) interactions and the value chosen is the majority provided that its initial margin is at least ω(√n log n). This protocol has the additional property of tolerating Byzantine behavior in o(√n) of the agents, making it the first known population protocol that tolerates Byzantine agents. Turning to the register machine construction from [2], we apply the 3-state approximate majority protocol and other techniques to speed up the per-step parallel time overhead of the simulation from O(log⁴ n) to O(log² n). To increase the robustness of the phase clock at the heart of the register machine, we describe a consensus version of the phase clock and present encouraging simulation results; its analysis remains an open problem.
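The protocol above can be simulated directly. The sketch below uses the transition rules as they are commonly stated for this 3-state protocol (one-way: only the responder updates; a blank adopts the initiator's value, and an opposing value is first erased to blank); the function names and parameters are illustrative, not from the paper.

```python
import random

# States: 'x', 'y', 'b' (blank). One-way transitions applied to the responder:
#   initiator x meets responder y -> responder becomes 'b'
#   initiator x meets responder b -> responder becomes 'x'
#   (and symmetrically for initiator y)

def step(agents, rng):
    i, j = rng.sample(range(len(agents)), 2)  # random ordered pair: initiator i, responder j
    init, resp = agents[i], agents[j]
    if init in ('x', 'y'):
        if resp == 'b':
            agents[j] = init   # blank adopts the initiator's value
        elif resp != init:
            agents[j] = 'b'    # opposing value is erased to blank

def run(n_x, n_y, n_blank, seed=0):
    """Interact until all agents agree on a non-blank value."""
    rng = random.Random(seed)
    agents = ['x'] * n_x + ['y'] * n_y + ['b'] * n_blank
    interactions = 0
    while len(set(agents)) > 1 or agents[0] == 'b':
        step(agents, rng)
        interactions += 1
    return agents[0], interactions

value, t = run(n_x=60, n_y=40, n_blank=0)
# With a clear initial majority, value is usually 'x', and t is on the
# order of n log n interactions for n = 100 agents.
```

Note that the last non-blank value can never be erased (erasing it would require a non-blank initiator of the opposite value), so a configuration with at least one non-blank always converges.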
Symposium on the Theory of Computing | 2013
David Eisenstat; Philip N. Klein
We give simple linear-time algorithms for two problems in planar graphs: max st-flow in directed graphs with unit capacities, and multiple-source shortest paths in undirected graphs with unit lengths.
Algorithmic Learning Theory | 2010
Dana Angluin; David Eisenstat; Leonid Kontorovich; Lev Reyzin
We show that random DNF formulas, random log-depth decision trees and random deterministic finite acceptors cannot be weakly learned with a polynomial number of statistical queries with respect to an arbitrary distribution on examples.
Information Processing Letters | 2007
David Eisenstat; Dana Angluin
The known O(dk log k) bound on the VC dimension of k-fold unions or intersections of a given concept class with VC dimension d is shown to be asymptotically tight.
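As a small sanity check on the flavor of this bound (not the paper's construction), the hypothetical brute-force computation below measures the VC dimension of a toy class: single intervals over a small discrete domain have VC dimension 2, and their 2-fold unions have VC dimension 4, within the O(dk log k) ceiling for d = k = 2. All names here are illustrative.

```python
from itertools import combinations

def intervals(n):
    """All discrete intervals [a, b) over {0, ..., n-1}, including the empty one."""
    return [frozenset(range(a, b)) for a in range(n) for b in range(a, n + 1)]

def k_fold_unions(concepts, k):
    """Distinct unions of k concepts (the empty interval makes this include smaller unions)."""
    return {frozenset().union(*combo) for combo in combinations(concepts, k)}

def vc_dimension(domain, concepts):
    """Largest d such that some d-point subset of the domain is shattered."""
    d = 0
    for size in range(1, len(domain) + 1):
        if any(len({frozenset(S) & c for c in concepts}) == 2 ** size
               for S in combinations(domain, size)):
            d = size          # some size-point set realizes all 2^size patterns
        else:
            break             # subsets of shattered sets are shattered, so stop
    return d

domain = list(range(10))
base = intervals(10)                 # VC dimension 2
unions = k_fold_unions(base, 2)      # 2-fold unions: VC dimension 4
print(vc_dimension(domain, base), vc_dimension(domain, unions))
```

The early `break` is valid because shattering is monotone: if no set of size d is shattered, no larger set can be.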
ACM Symposium on Parallel Algorithms and Architectures | 2010
James Aspnes; David Eisenstat; Yitong Yin
We consider the problem of minimizing contention in static dictionary data structures, where the contention on each cell is measured by the expected number of probes to that cell given an input that is chosen from a distribution that is not known to the query algorithm (but that may be known when the data structure is built). When all positive queries are equally probable, and similarly all negative queries are equally probable, we show that it is possible to construct a data structure using linear space s, a constant number of queries, and with contention O(1/s) on each cell, corresponding to a nearly-flat load distribution. All of these quantities are asymptotically optimal. For arbitrary query distributions, the lack of knowledge of the query distribution by the query algorithm prevents perfect load leveling in this case: we present a lower bound, based on VC-dimension, that shows that for a wide range of data structure problems, achieving contention even within a polylogarithmic factor of optimal requires a cell-probe complexity of Ω(log log n).
Information Processing Letters | 2009
David Eisenstat
We show that 2 is the minimum VC dimension of a concept class whose k-fold union has VC dimension Ω(k log k).
International Conference on Stabilization, Safety, and Security of Distributed Systems | 2010
Dana Angluin; James Aspnes; Rida A. Bazzi; Jiang Chen; David Eisenstat; Goran Konjevod
We consider the question of how much information can be stored by labeling the vertices of a connected undirected graph G using a constant-size set of labels, when isomorphic labelings are not distinguishable. An exact information-theoretic bound is easily obtained by counting the number of isomorphism classes of labelings of G, which we call the information-theoretic capacity of the graph. More interesting is the effective capacity of members of some class of graphs, the number of states distinguishable by a Turing machine that uses the labeled graph itself in place of the usual linear tape. We show that the effective capacity equals the information-theoretic capacity up to constant factors for trees, random graphs with polynomial edge probabilities, and bounded-degree graphs.
Distributed Computing | 2007
Dana Angluin; James Aspnes; David Eisenstat; Eric Ruppert
Lecture Notes in Computer Science | 2006
Dana Angluin; James Aspnes; David Eisenstat
Symposium on Discrete Algorithms | 2012
David Eisenstat; Philip N. Klein; Claire Mathieu