Elazar Goldenberg
Weizmann Institute of Science
Publications
Featured research published by Elazar Goldenberg.
foundations of computer science | 2008
Irit Dinur; Elazar Goldenberg
Given a function f : X → Σ, its ℓ-wise direct product is the function F = f^ℓ : X^ℓ → Σ^ℓ defined by F(x₁, ..., x_ℓ) = (f(x₁), ..., f(x_ℓ)). We are interested in the local testability of the direct product encoding (the mapping f ↦ f^ℓ). Namely, given an arbitrary function F : X^ℓ → Σ^ℓ, we wish to determine how close it is to f^ℓ for some f : X → Σ by making two random queries into F. In this work we analyze the case of low acceptance probability of the test. We show that even if the test passes only with small probability ε > 0, F must already have non-trivial structure; in particular, it must agree with some f^ℓ on nearly an ε fraction of the domain. Moreover, we give a structural characterization of all functions F on which the test passes with probability ε. Our results can be viewed as a combinatorial analogue of the low-error "low degree test" used in PCP constructions.
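As a toy illustration, the direct product encoding and a two-query consistency check can be sketched in Python. The intersection pattern and acceptance predicate below are simplified assumptions for illustration, not the exact test analyzed in the paper:

```python
import random

def direct_product(f, xs):
    """The ell-wise direct product encoding: F(x1,...,xl) = (f(x1),...,f(xl))."""
    return tuple(f(x) for x in xs)

def consistency_test(F, domain, ell, m, rng=random):
    """Two-query consistency test (a sketch): draw two ell-tuples that share
    m common coordinates, and accept iff F's two answers agree on the shared
    positions. An honest encoding F = f^ell always passes."""
    common = [rng.choice(domain) for _ in range(m)]
    rest1 = [rng.choice(domain) for _ in range(ell - m)]
    rest2 = [rng.choice(domain) for _ in range(ell - m)]
    t1 = tuple(common + rest1)   # first query point
    t2 = tuple(common + rest2)   # second query point, same prefix
    a1, a2 = F(t1), F(t2)        # the two queries into F
    return a1[:m] == a2[:m]
```

For an honest encoding this test accepts with probability 1; the paper's question is what structure survives when an arbitrary F passes with only small probability ε.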
symposium on the theory of computing | 2016
Diptarka Chakraborty; Elazar Goldenberg; Michal Koucký
The Hamming and edit metrics are two common notions of distance between pairs of binary strings x, y ∈ {0,1}^n. The edit distance between x and y is the minimum number of character insertions, deletions, and substitutions needed to convert x into y, whereas the Hamming distance between x and y is the number of bit flips needed to convert x into y. In this paper we study a randomized injective embedding of the edit distance into the Hamming distance with small distortion. We show a randomized embedding with quadratic distortion: for any x, y whose edit distance equals k, the Hamming distance between the embeddings of x and y is O(k²) with high probability. This improves over the distortion of O(log n · log* n) obtained by Jowhari (2012) for small values of k. Moreover, the embedding's output size is linear in the input size, and the embedding can be computed in a single pass over the input. We provide several applications of this embedding. Among our results is a one-pass (streaming) algorithm for edit distance running in space O(s) and computing the edit distance exactly up to distance s^(1/6). This algorithm is based on a kernelization for edit distance that is of independent interest.
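For reference, the two metrics contrasted above can be computed directly. This is a plain illustration of the definitions (using the standard dynamic program for edit distance), not the randomized embedding itself:

```python
def hamming(x, y):
    """Hamming distance: number of positions where equal-length strings differ."""
    assert len(x) == len(y)
    return sum(a != b for a, b in zip(x, y))

def edit_distance(x, y):
    """Edit distance via the classic dynamic program: minimum number of
    insertions, deletions, and substitutions turning x into y.
    Uses a single rolling row, so space is O(len(y))."""
    n, m = len(x), len(y)
    dp = list(range(m + 1))              # row 0: converting "" into y[:j]
    for i in range(1, n + 1):
        prev, dp[0] = dp[0], i           # prev holds dp[i-1][j-1]
        for j in range(1, m + 1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,                           # delete x[i-1]
                dp[j - 1] + 1,                       # insert y[j-1]
                prev + (x[i - 1] != y[j - 1]),       # substitute / match
            )
    return dp[m]
```

Strings like "0101" and "1010" show the gap between the metrics: their Hamming distance is 4, but two edit operations (delete the leading bit, append one) suffice, which is the kind of discrepancy any low-distortion embedding must handle.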
Random Structures and Algorithms | 2017
Roee David; Elazar Goldenberg; Robert Krauthgamer
We study the problem of reconstructing a low-rank matrix, where the input is an n × m matrix M over a field F and the goal is to reconstruct a (near-optimal) matrix M′ that is low-rank and close to M under some distance function Δ. Furthermore, the reconstruction must be local, i.e., it provides access to any desired entry of M′ by reading only a few entries of the input M (ideally, a number independent of the matrix dimensions n and m). Our formulation of this problem is inspired by the local reconstruction framework of Saks and Seshadhri (SICOMP, 2010). Our main result is a local reconstruction algorithm for the case where Δ is the normalized Hamming distance (between matrices). Given M that is ε-close to a matrix of rank d < 1/ε (together with d and ε), this algorithm computes with high probability a rank-d matrix M′ that is O(dε)-close to M. This local algorithm proceeds in two phases. The preprocessing phase reads only Õ(d/ε³) random entries of M and stores a small data structure. The query phase deterministically outputs a desired entry M′_{i,j} by reading only the data structure and 2d additional entries of M. We also consider local reconstruction in an easier setting, where the algorithm can read an entire matrix column in a single operation. When Δ is the normalized Hamming distance between vectors, we derive an algorithm that runs in polynomial time by applying our main result for matrix reconstruction. For comparison, when Δ is the truncated Euclidean distance and F = ℝ, we analyze sampling algorithms using statistical learning tools. A preliminary version of this paper appears in ECCC, see: http://eccc.hpi-web.de/report/2015/128/
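The query pattern described above (a small stored data structure plus 2d extra entry reads per query) can be illustrated with a CUR-style sketch for an exactly rank-d matrix. This is a simplified, noise-free stand-in, not the paper's algorithm: the "anchor" index sets and the exact-rank, invertible-intersection assumptions are ours:

```python
import numpy as np

def preprocess(M, rows, cols):
    """Preprocessing sketch: store the inverse of the d x d submatrix of M
    at the intersection of the anchor rows and anchor columns. (The paper's
    preprocessing samples randomly and tolerates an eps fraction of noise;
    this illustration assumes M is exactly rank d with invertible W.)"""
    W = M[np.ix_(rows, cols)]
    return np.linalg.inv(W)

def query(M, rows, cols, W_inv, i, j):
    """Query sketch: output entry (i, j) of the rank-d reconstruction while
    reading only 2d entries of M: row i at the d anchor columns, and
    column j at the d anchor rows."""
    r = M[i, cols]          # d entries from row i
    c = M[rows, j]          # d entries from column j
    return float(r @ W_inv @ c)
```

When M = A·B with the anchor rows of A and anchor columns of B invertible, r·W⁻¹·c recovers M[i, j] exactly, which shows why 2d reads plus a d × d data structure can suffice per entry in the noiseless case.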
conference on innovations in theoretical computer science | 2015
Roee David; Irit Dinur; Elazar Goldenberg; Guy Kindler; Igor Shinkar
Electronic Colloquium on Computational Complexity | 2015
Diptarka Chakraborty; Elazar Goldenberg; Michal Koucký
foundations of computer science | 2018
Diptarka Chakraborty; Debarati Das; Elazar Goldenberg; Michal Koucký; Michael E. Saks
arXiv: Data Structures and Algorithms | 2016
Diptarka Chakraborty; Elazar Goldenberg; Michal Koucký
Electronic Colloquium on Computational Complexity | 2015
Roee David; Elazar Goldenberg; Robert Krauthgamer
Electronic Colloquium on Computational Complexity | 2014
Roee David; Irit Dinur; Elazar Goldenberg; Guy Kindler; Igor Shinkar
international colloquium on automata languages and programming | 2013
Irit Dinur; Elazar Goldenberg