Brendan Juba
Harvard University
Publications
Featured research published by Brendan Juba.
international conference on machine learning | 2006
Brendan Juba
We show that it is possible to use data compression on independently obtained hypotheses from various tasks to algorithmically provide guarantees that the tasks are sufficiently related to benefit from multitask learning. We give uniform bounds on the true average error of the n hypotheses, in terms of their empirical average error, for hypotheses provided by deterministic learning algorithms drawing independent samples from a set of n unknown computable task distributions over finite sets.
Journal of the ACM | 2012
Oded Goldreich; Brendan Juba; Madhu Sudan
We put forward a general theory of goal-oriented communication, where communication is not an end in itself, but rather a means to achieving some goals of the communicating parties. Focusing on goals provides a framework for addressing the problem of potential “misunderstanding” during communication, where the misunderstanding arises from lack of initial agreement on what protocol and/or language is being used in communication. In this context, “reliable communication” means overcoming any initial misunderstanding between parties towards achieving a given goal. Despite the enormous diversity among the goals of communication, we propose a simple model that captures all goals. In the simplest form of communication we consider, two parties, a user and a server, attempt to communicate with each other in order to achieve some goal of the user. We show that any goal of communication can be modeled mathematically by introducing a third party, which we call the referee, who hypothetically monitors the conversation between the user and the server and determines whether or not the goal has been achieved. Potential misunderstanding between the players is captured by allowing each player (the user/server) to come from a (potentially infinite) class of players such that each player is unaware which instantiation of the other it is talking to. We identify a main concept, which we call sensing, that allows goals to be achieved even under misunderstanding. Informally, sensing captures the user's ability (potentially using help from the server) to simulate the referee's assessment of whether the communication is achieving the goal. We show that when the user can sense progress, the goal of communication can be achieved despite initial misunderstanding. We also show that in certain settings sensing is necessary for overcoming such initial misunderstanding.
Our results significantly extend the scope of the investigation started by Juba and Sudan (STOC 2008) who studied the foregoing phenomenon in the case of a single specific goal. Our study shows that their main suggestion, that misunderstanding can be detected and possibly corrected by focusing on the goal, can be proved in full generality.
international joint conference on artificial intelligence | 2017
Mithun Chakraborty; Kai Yee Phoebe Chua; Sanmay Das; Brendan Juba
In this paper, we introduce a multi-agent multi-armed bandit-based model for ad hoc teamwork with expensive communication. The goal of the team is to maximize the total reward gained from pulling arms of a bandit over a number of epochs. In each epoch, each agent decides whether to pull an arm, or to broadcast the reward it obtained in the previous epoch to the team and forgo pulling an arm. These decisions must be made only on the basis of the agent's private information and the public information broadcast prior to that epoch. We first benchmark the achievable utility by analyzing an idealized version of this problem where a central authority has complete knowledge of rewards acquired from all arms in all epochs and uses a multiplicative weights update algorithm for allocating arms to agents. We then introduce an algorithm for the decentralized setting that uses a value-of-information based communication strategy and an exploration-exploitation strategy based on the centralized algorithm, and show experimentally that it converges rapidly to the performance of the centralized method.
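The multiplicative-weights style of allocation used in the centralized benchmark can be sketched as follows. This is a minimal illustration only: the reward model, learning rate, and sampling scheme here are assumptions for the sketch, not the paper's exact algorithm.

```python
import random

def multiplicative_weights(n_arms, n_epochs, reward_fn, eta=0.1):
    """Sketch of a multiplicative-weights arm allocator: sample an arm
    in proportion to its weight, then boost the weight of the pulled
    arm by a factor depending on the observed reward in [0, 1]."""
    weights = [1.0] * n_arms
    total_reward = 0.0
    for _ in range(n_epochs):
        total = sum(weights)
        probs = [w / total for w in weights]
        arm = random.choices(range(n_arms), weights=probs)[0]
        r = reward_fn(arm)                 # observed reward in [0, 1]
        total_reward += r
        weights[arm] *= (1.0 + eta * r)    # multiplicative update
    return total_reward

# toy example (hypothetical Bernoulli arms; arm 2 pays best on average)
means = [0.2, 0.5, 0.8]
random.seed(0)
gain = multiplicative_weights(3, 500, lambda a: random.random() < means[a])
```

Over many epochs the weight of the best arm dominates, so the allocator's average reward approaches that arm's mean.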
Journal of Computer and System Sciences | 2018
Mahdi Cheraghchi; Elena Grigorescu; Brendan Juba; Karl Wimmer; Ning Xie
AC⁰ ∘ MOD₂ circuits are AC⁰ circuits augmented with a layer of parity gates just above the input layer. We study AC⁰ ∘ MOD₂ circuit lower bounds for computing the Boolean Inner Product function. Recent works by Servedio and Viola (ECCC TR12-144) and Akavia et al. (ITCS 2014) have highlighted this problem as a frontier problem in circuit complexity that arose both as a first step towards solving natural special cases of the matrix rigidity problem and as a candidate for constructing pseudorandom generators of minimal complexity. We give the first superlinear lower bound for the Boolean Inner Product function against AC⁰ ∘ MOD₂ circuits of depth four or greater. Specifically, we prove a superlinear lower bound for circuits of arbitrary constant depth, and an Ω̃(n²) lower bound for the special case of depth-4 AC⁰ ∘ MOD₂.
international joint conference on artificial intelligence | 2017
Roni Stern; Brendan Juba
In this paper we explore the theoretical boundaries of planning in a setting where no model of the agent's actions is given. Instead of an action model, a set of successfully executed plans is given, and the task is to generate a plan that is safe, i.e., guaranteed to achieve the goal without failing. To this end, we show how to learn a conservative model of the world in which actions are guaranteed to be applicable. This conservative model is then given to an off-the-shelf classical planner, resulting in a plan that is guaranteed to achieve the goal. However, this reduction from model-free planning to model-based planning is not complete: in some cases a plan will not be found even when one exists. We analyze the relation between the number of observed plans and the likelihood that our conservative approach will indeed fail to solve a solvable problem. Our analysis shows that the number of trajectories needed scales gracefully.
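The idea of a conservative (safe) action model can be sketched from observed trajectories: treat every fluent that held before all observed occurrences of an action as a precondition, and keep only effects seen on every occurrence. The state and trajectory encoding below are illustrative assumptions, not the paper's formalism.

```python
def learn_safe_model(trajectories):
    """Learn a conservative action model from successful trajectories.
    Preconditions are intersected over observed pre-states (so the
    learned model only applies an action where it was always seen to
    apply), and add/delete effects keep only changes seen every time."""
    pre, add, dele = {}, {}, {}
    for traj in trajectories:
        for state, action, next_state in traj:
            s, ns = frozenset(state), frozenset(next_state)
            if action not in pre:
                pre[action] = set(s)
                add[action] = set(ns - s)
                dele[action] = set(s - ns)
            else:
                pre[action] &= s        # conservative: shrink preconditions
                add[action] &= ns - s   # keep effects observed every time
                dele[action] &= s - ns
    return pre, add, dele

# two observed plans using the same (hypothetical) action "act"
t1 = [({"a", "b"}, "act", {"a", "b", "c"})]
t2 = [({"a"}, "act", {"a", "c"})]
pre, add, dele = learn_safe_model([t1, t2])
# preconditions shrink to {"a"}; "c" is a reliably observed add effect
```

The learned model over-restricts where actions may be applied, which is exactly why the resulting plans are safe but the reduction can fail to find a plan that exists.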
Closed Loop Neuroscience, 2016, ISBN 978-0-12-802452-2, pp. 131–144 | 2016
Brendan Juba
At present, the brain is viewed primarily as a biological computer. But, crucially, the plasticity of the brain's structure leads it to vary in functionally significant ways across individuals. Understanding the brain necessitates an understanding of the range of such variation. For example, the number of neurons in the brain and its finer structures impose inherent limitations on the functionality it can realize. The relationship between such quantitative limits on the resources available and the computations that are feasible with such resources is the subject of study in computational complexity theory. Computational complexity is a potentially useful conceptual framework because it enables the meaningful study of the family of possible structures as a whole—the study of “the brain,” as opposed to some particular brain. The language of computational complexity also provides a means of formally capturing capabilities of the brain, which may otherwise be philosophically thorny.
conference on innovations in theoretical computer science | 2015
Brendan Juba
We consider the proof search (“automatizability”) problem for propositional proof systems in the context of knowledge discovery (or data mining and analytics). Discovered knowledge necessarily features a weaker semantics than usually employed in mathematical logic, and in this work we find that these weaker semantics may result in a proof search problem that seems easier than the classical problem, but that is nevertheless nontrivial. Specifically, if we consider a knowledge discovery task corresponding to the unsupervised learning of parities over the uniform distribution from partial information, then we find the following: proofs in the system polynomial calculus with resolution (PCR) can be detected in quasipolynomial time, in contrast to the n^{O(√n)}-time best known algorithm for classical proof search for PCR. By contrast, a quasipolynomial-time algorithm that distinguishes whether a formula is satisfied a 1−ε fraction of the time or merely an ε fraction of the time (for polynomially small ε) would give a randomized quasipolynomial-time algorithm for NP, so the use of the promise of a small PCR proof is essential in the above result. Likewise, if integer factoring requires subexponential time, we find that bounded-depth Frege proofs cannot be detected in quasipolynomial time. The final result essentially shows that negative results based on the hardness of interpolation [31, 13, 11] persist under this new semantics, while the first result suggests, in light of negative results for PCR [22] and resolution [2] under the classical semantics, that there are intriguing new possibilities for proof search in the context of knowledge discovery and data analysis.
allerton conference on communication, control, and computing | 2015
Mark Braverman; Brendan Juba
We consider the problem of one-way communication when the recipient does not know exactly the distribution that the messages are drawn from, but has a “prior” distribution that is known to be close to the source distribution, a problem first considered by Juba et al. [5]. This problem generalizes the classical source coding problem in information theory, in which the receiver knows the source distribution exactly, that was first considered in Shannon's work [6]. This “uncertain priors” coding problem was intended to illuminate aspects of natural language communication, and has applications to adaptive compression schemes. We consider the question of how much longer the messages need to be in order to cope with the uncertainty that the sender and receiver have about the receiver's prior and the source distribution, respectively, as compared to the source coding problem. We obtain lower bounds for one-way communication using uncertain priors that are tight up to low-order terms. Specifically, we consider two variants of the uncertain priors problem. First, we consider the original setting of Juba et al. [5] in which the receiver is required to correctly recover the message with probability 1. We find that in this setting, the overhead of 2 log α + O(1) bits achieved by that scheme when the prior is α-close to the source is optimal up to an additive O(log log α) bits. We also consider a setting introduced in the work of Haramaty and Sudan [3], in which the receiver is permitted to fail to recover the message with some positive probability ε. In this setting, we find that the optimal overhead is essentially log α + log 1/ε bits by presenting both a variant of the coding scheme of Juba et al. with an overhead of log α + log 1/ε + 1 bits, and a lower bound that matches up to an additive O(log log α) bits.
Our lower bounds are obtained by observing that a worst-case, one-way communication complexity problem can be embedded in the sources and priors that any uncertain priors coding scheme must address.
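The overhead bounds quoted in the abstract are simple to evaluate numerically. The constants below are the ones stated above; everything else is just arithmetic, not the paper's coding scheme itself.

```python
import math

def zero_error_overhead(alpha):
    """Zero-error setting (Juba et al.): about 2 log alpha + O(1)
    extra bits when the prior is alpha-close to the source; the O(1)
    term is omitted here."""
    return 2 * math.log2(alpha)

def eps_error_overhead(alpha, eps):
    """Epsilon-error setting (Haramaty-Sudan): essentially
    log alpha + log(1/eps) bits of overhead."""
    return math.log2(alpha) + math.log2(1.0 / eps)

# e.g. alpha = 1024, eps = 1/256:
# zero-error overhead is about 20 bits; eps-error about 10 + 8 = 18 bits
z = zero_error_overhead(1024)
e = eps_error_overhead(1024, 1 / 256)
```

Allowing even a small failure probability thus roughly halves the log α part of the overhead, at the price of an additive log 1/ε term.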
pervasive computing and communications | 2013
Brendan Juba
We give an overview of a theory of semantic communication proposed by Goldreich, Juba, and Sudan. The theory is intended to capture the obstacles that arise when a diverse population of independently designed devices must communicate with one another. The aim of the theory is to provide conceptual foundations for the design and evaluation of devices that are compatible with such a diverse population. Conclusions drawn from the theory (i) identify a kind of information-sensing that is inherently necessary for compatibility whenever the population is sufficiently diverse and (ii) identify tensions between the richness of diversity and the computational cost of coping with such diversity in a population. We will review how these considerations are reflected in the formulation and design of an example application, a self-patching packet network stack. In particular, this application will illustrate the utility of explicit consideration of various computational complexity measures in addressing both (i) and (ii). We will also review work aimed at identifying kinds of populations across which compatibility can be achieved efficiently.
principles of distributed computing | 2011
Brendan Juba
We present the first end-user protocol, guaranteeing the delivery of messages, that automatically adapts to any new packet format that is obtained by applying a short, efficient function to packets from an earlier protocol.