Publication


Featured research published by Sudeep Kamath.


International Symposium on Network Coding | 2011

Generalized Network Sharing Outer Bound and the Two-Unicast Problem

Sudeep Kamath; David Tse; Venkat Anantharam

We describe a simple improvement over the Network Sharing outer bound for the multiple unicast problem. We call this the Generalized Network Sharing (GNS) outer bound. We note two properties of this bound with regard to the two-unicast problem: a) it is the tightest bound that can be realized using only edge-cut bounds and b) it is tight in the special case when all edges except those from a so-called minimal GNS set have sufficiently large capacities. Finally, we present an example showing that the GNS outer bound is not tight for the two-unicast problem.
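
To make the edge-cut idea concrete, the sketch below computes ordinary cut-set bounds for a small made-up two-unicast network using networkx. The graph, capacities, and node names are invented for illustration; the GNS bound is a sharper refinement of this kind of edge-cut argument and is not what this code computes.

# Plain cut-set bounds for a toy two-unicast instance (illustration only;
# this is not the GNS bound and not an example from the paper).
import networkx as nx

G = nx.DiGraph()
# Unit-capacity edges of a small layered DAG with sources s1, s2 and
# destinations t1, t2.
for u, v in [("s1", "a"), ("s2", "a"), ("a", "b"),
             ("b", "t1"), ("b", "t2"), ("s1", "t2"), ("s2", "t1")]:
    G.add_edge(u, v, capacity=1)

# Individual bounds: R1 <= mincut(s1, t1) and R2 <= mincut(s2, t2).
r1_bound, _ = nx.minimum_cut(G, "s1", "t1")
r2_bound, _ = nx.minimum_cut(G, "s2", "t2")

# A crude sum-rate bound: R1 + R2 is at most the capacity of any edge set
# separating both sources from both destinations, computed here via a
# super-source and super-sink.
H = G.copy()
for s in ("s1", "s2"):
    H.add_edge("S", s, capacity=float("inf"))
for t in ("t1", "t2"):
    H.add_edge(t, "T", capacity=float("inf"))
sum_bound, _ = nx.minimum_cut(H, "S", "T")

print("R1 <=", r1_bound, " R2 <=", r2_bound, " R1 + R2 <=", sum_bound)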


Allerton Conference on Communication, Control, and Computing | 2012

Non-interactive simulation of joint distributions: The Hirschfeld-Gebelein-Rényi maximal correlation and the hypercontractivity ribbon

Sudeep Kamath; Venkat Anantharam

We consider the following problem: Alice and Bob observe sequences X^n and Y^n respectively, where {(X_i, Y_i)}_{i=1}^∞ are drawn i.i.d. from P(x, y), and they output U and V respectively, which are required to have a joint law that is close in total variation to a specified Q(u, v). One important technique for establishing impossibility results for this problem is the Hirschfeld-Gebelein-Rényi maximal correlation, which was considered by Witsenhausen [1]. Hypercontractivity, studied by Ahlswede and Gács [2], and reverse hypercontractivity, recently studied by Mossel et al. [3], provide another approach for proving impossibility results. We consider the tightest impossibility results that can be obtained using hypercontractivity and reverse hypercontractivity and provide a necessary and sufficient condition on the source distribution P(x, y) for when this approach subsumes the maximal correlation approach. We show that the binary pair source distribution with symmetric noise satisfies this condition.
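
For finite alphabets, the maximal correlation referenced above can be computed as the second-largest singular value of the normalized joint-pmf matrix. A minimal sketch; the example distribution is a doubly symmetric binary source with a made-up crossover probability:

# Hirschfeld-Gebelein-Renyi maximal correlation of a finite joint pmf
# P(x, y), computed as the second-largest singular value of the matrix
# Q[x, y] = P(x, y) / sqrt(p(x) q(y)).
import numpy as np

def maximal_correlation(P):
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)           # marginal of X
    py = P.sum(axis=0)           # marginal of Y
    Q = P / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(Q, compute_uv=False)
    return s[1]                  # s[0] == 1 corresponds to constant functions

# Doubly symmetric binary source with crossover probability eps = 0.1:
eps = 0.1
P = 0.5 * np.array([[1 - eps, eps], [eps, 1 - eps]])
print(maximal_correlation(P))    # about 1 - 2*eps = 0.8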


Allerton Conference on Communication, Control, and Computing | 2013

On hypercontractivity and the mutual information between Boolean functions

Venkat Anantharam; Amin Gohari; Sudeep Kamath; Chandra Nair

Hypercontractivity has had many successful applications in mathematics, physics, and theoretical computer science. In this work we use recently established properties of the hypercontractivity ribbon of a pair of random variables to study a recent conjecture regarding the mutual information between binary functions of the individual marginal sequences of a sequence of pairs of random variables drawn from a doubly symmetric binary source.
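
As a small illustration of the conjecture referred to above, understood here as the statement I(b1(X^n); b2(Y^n)) ≤ I(X1; Y1) for Boolean functions b1, b2 of sequences drawn from a doubly symmetric binary source, the sketch below checks it by brute force at n = 2 for an arbitrarily chosen crossover probability. The parameters and the exhaustive check are illustrative assumptions, not the paper's analysis.

# Brute-force check, at n = 2, of the conjectured bound
#     I(b1(X^n); b2(Y^n)) <= I(X1; Y1)
# for a doubly symmetric binary source.  Illustration only.
import itertools
import math

eps, n = 0.1, 2

def pair_pmf(x, y):
    # Doubly symmetric binary source: X uniform, Y = X through a BSC(eps).
    return 0.5 * ((1 - eps) if x == y else eps)

def mutual_information(joint):
    # joint: dict mapping (u, v) -> probability
    pu, pv = {}, {}
    for (u, v), p in joint.items():
        pu[u] = pu.get(u, 0.0) + p
        pv[v] = pv.get(v, 0.0) + p
    return sum(p * math.log2(p / (pu[u] * pv[v]))
               for (u, v), p in joint.items() if p > 0)

seqs = list(itertools.product((0, 1), repeat=n))
# Every Boolean function on {0,1}^n, encoded by its table of output bits.
tables = list(itertools.product((0, 1), repeat=len(seqs)))

single_letter = mutual_information(
    {(x, y): pair_pmf(x, y) for x in (0, 1) for y in (0, 1)})

worst = 0.0
for b1 in tables:
    for b2 in tables:
        joint = {}
        for i, xs in enumerate(seqs):
            for j, ys in enumerate(seqs):
                p = math.prod(pair_pmf(x, y) for x, y in zip(xs, ys))
                key = (b1[i], b2[j])
                joint[key] = joint.get(key, 0.0) + p
        worst = max(worst, mutual_information(joint))

print("max I(b1;b2) =", worst, " I(X1;Y1) =", single_letter)
print("conjectured bound holds at n = 2:", worst <= single_letter + 1e-12)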


International Symposium on Information Theory | 2014

On hypercontractivity and a data processing inequality

Venkat Anantharam; Amin Gohari; Sudeep Kamath; Chandra Nair

In this paper we provide the correct tight constant to a data-processing inequality claimed by Erkip and Cover. The correct constant turns out to be a particular hypercontractivity parameter of (X,Y), rather than their squared maximal correlation. We also provide alternate geometric characterizations for both maximal correlation as well as the hypercontractivity parameter that characterizes the data-processing inequality.
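
A rough numeric sketch of the two constants contrasted above. It assumes the hypercontractivity parameter can be written as sup over input distributions r of D(r_Y || p_Y) / D(r_X || p_X), which is how this quantity is commonly characterized in related work; that formula, the grid search, and the example pmf should all be read as illustrative assumptions rather than the paper's definitions.

# Numeric sketch contrasting the squared maximal correlation of (X, Y)
# with a hypercontractivity-type constant.  ASSUMPTION: the constant is
# taken here as sup_r D(r_Y || p_Y) / D(r_X || p_X), approximated by a
# crude grid search; the joint pmf is made up for illustration.
import numpy as np

P = np.array([[0.40, 0.10],      # an arbitrary (asymmetric) joint pmf
              [0.05, 0.45]])
px, py = P.sum(axis=1), P.sum(axis=0)
W = P / px[:, None]              # channel P(y | x)

def kl(a, b):
    mask = a > 0
    return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

# Squared maximal correlation via the second singular value.
Q = P / np.sqrt(np.outer(px, py))
rho2 = np.linalg.svd(Q, compute_uv=False)[1] ** 2

# Grid approximation of sup_r D(r_Y || p_Y) / D(r_X || p_X).
best = 0.0
for t in np.linspace(0.001, 0.999, 999):
    rx = np.array([t, 1 - t])
    dx = kl(rx, px)
    if dx < 1e-9:                # skip r_X too close to p_X
        continue
    best = max(best, kl(rx @ W, py) / dx)

print("squared maximal correlation ~", round(rho2, 4))
print("hypercontractivity-type constant (grid lower bound) ~", round(best, 4))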


International Symposium on Information Theory | 2011

Two unicast information flows over linear deterministic networks

I-Hsiang Wang; Sudeep Kamath; David Tse

We investigate the two unicast flow problem over layered linear deterministic networks with an arbitrary number of nodes. When the minimum cut value between each source-destination pair is constrained to be 1, it is obvious that the triangular rate region {(R_1, R_2) : R_1, R_2 ≥ 0, R_1 + R_2 ≤ 1} can be achieved, and that one cannot achieve beyond the square rate region {(R_1, R_2) : R_1, R_2 ≥ 0, R_1 ≤ 1, R_2 ≤ 1}. Analogous to the work by Wang and Shroff for wired networks [1], we provide the necessary and sufficient conditions for the capacity region to be the triangular region and the necessary and sufficient conditions for it to be the square region. Moreover, we completely characterize the capacity region and conclude that there are exactly three more possible capacity regions for this class of networks, in contrast to the result for wired networks, where only two rate regions are possible. Our achievability scheme is based on linear coding over an extension field, with at most four nodes performing special linear coding operations, namely interference neutralization and zero forcing, while all other nodes perform random linear coding.


International Symposium on Information Theory | 2008

On distributed function computation in structure-free random networks

Sudeep Kamath; D. Manjunath

We consider in-network computation of MAX in a structure-free random multihop wireless network. Nodes do not know their relative or absolute locations and use the Aloha MAC protocol. For one-shot computation, we describe a protocol in which the MAX value becomes available at the origin in O(√(n/log n)) slots with high probability. This is within a constant factor of that required by the best coordinated protocol. A minimal structure (knowledge of hop-distance from the sink) is imposed on the network, and with this structure, we describe a protocol for pipelined computation of MAX that achieves a rate of Ω(1/(log n)^2).
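
A toy simulation in the spirit of the setting described above: nodes on a line forward a running maximum toward node 0 under a slotted-Aloha style MAC. The topology, collision model, and transmit probability are made up for illustration; this is not the protocol analyzed in the paper.

# Toy simulation: MAX spreading toward a sink over a line network with a
# slotted-Aloha style MAC.  Cartoon of the setting only.
import random

def slots_until_max_at_sink(n=100, p=0.3, seed=0):
    rng = random.Random(seed)
    values = [rng.random() for _ in range(n)]   # node i holds values[i]
    known = list(values)                        # running max at each node
    true_max = max(values)
    slots = 0
    while known[0] < true_max:
        slots += 1
        tx = [rng.random() < p for _ in range(n)]
        new = list(known)
        for i in range(1, n):
            # Node i-1 hears node i only if i is the sole transmitter
            # among its immediate neighbourhood (crude collision model).
            neighbours = [j for j in (i - 1, i, i + 1) if j < n]
            if tx[i] and sum(tx[j] for j in neighbours) == 1:
                new[i - 1] = max(new[i - 1], known[i])
        known = new
    return slots

print(slots_until_max_at_sink())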


International Symposium on Information Theory | 2014

Two-unicast is hard

Sudeep Kamath; David Tse; Chih-Chun Wang

Consider the k-unicast network coding problem over an acyclic wireline network: Given a rate vector k-tuple, determine whether the network of interest can support k unicast flows with those rates. It is well known that the one-unicast problem is easy and that it is solved by the celebrated max-flow min-cut theorem. The hardness of k-unicast problems with small k has been an open problem. We show that the two-unicast problem is as hard as any k-unicast problem for k ≥ 3. Our result suggests that the difficulty of a network coding instance is related more to the magnitude of the rates in the rate tuple than to the number of unicast sessions. As a consequence of our result and other well-known results, we show that linear coding is insufficient to achieve capacity, and non-Shannon inequalities are necessary for characterizing capacity, even for two-unicast networks.


Allerton Conference on Communication, Control, and Computing | 2010

A new dual to the Gács-Körner common information defined via the Gray-Wyner system

Sudeep Kamath; Venkat Anantharam

We consider jointly distributed random variables X and Y. After describing the Gács-Körner common information between the random variables from the viewpoint of the capacity region of the Gray-Wyner system, we propose a new notion of common information between the random variables that is dual to the Gács-Körner common information from this viewpoint in a well-defined sense. We characterize this quantity explicitly in terms of two auxiliary quantities that are asymmetric in nature, and illustrate the operational significance of these new quantities by characterizing a corner point of the solution to a problem of source coding with side-information in terms of them. We also contrast this new concept of common information for a pair of random variables with the Wyner common information of the random variables, which is also a kind of dual to the Gács-Körner common information.


Conference on Information Sciences and Systems | 2016

An operational measure of information leakage

Ibrahim Issa; Sudeep Kamath; Aaron B. Wagner

Given two discrete random variables X and Y, an operational approach is undertaken to quantify the “leakage” of information from X to Y. The resulting measure ℒ(X→Y) is called maximal leakage, and is defined as the multiplicative increase, upon observing Y, of the probability of correctly guessing a randomized function of X, maximized over all such randomized functions. It is shown to be equal to the Sibson mutual information of order infinity, giving the latter operational significance. Its resulting properties are consistent with an axiomatic view of a leakage measure; for example, it satisfies the data processing inequality, it is asymmetric, and it is additive over independent pairs of random variables. Moreover, it is shown that the definition is robust in several respects: allowing for several guesses or requiring the guess to be only within a certain distance of the true function value does not change the resulting measure.
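
Since the abstract notes that maximal leakage equals the Sibson mutual information of order infinity, it admits a simple closed form for finite alphabets, ℒ(X→Y) = log Σ_y max_{x: p(x)>0} P(y|x). A minimal sketch with a made-up joint pmf:

# Maximal leakage of a finite joint pmf, via the closed form
#     L(X -> Y) = log sum_y max_{x: p(x) > 0} P(y | x)
# (the Sibson mutual information of order infinity).
import numpy as np

def maximal_leakage_bits(P):
    P = np.asarray(P, dtype=float)
    px = P.sum(axis=1)
    supp = px > 0
    W = P[supp] / px[supp][:, None]      # channel P(y | x) on the support of X
    return np.log2(W.max(axis=0).sum())

P = np.array([[0.30, 0.20],              # example joint pmf (illustration only)
              [0.10, 0.40]])
print(maximal_leakage_bits(P), "bits")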


IEEE Transactions on Information Theory | 2014

On Distributed Function Computation in Structure-Free Random Wireless Networks

Sudeep Kamath; D. Manjunath; Ravi R. Mazumdar

We consider in-network computation of MAX and the approximate histogram in an n-node structure-free random multihop wireless network. The key assumption that we make is that the nodes do not know their relative or absolute locations and that they do not have an identity. For the Aloha MAC protocol, we first describe a protocol in which the MAX value becomes available at the origin in O(√(n/log n)) slots (bit-periods) with high probability. This is within a constant factor of that required by the best coordinated protocol. A minimal structure (knowledge of hop-distance from the sink) is imposed on the network, and with this structure, we describe a protocol for pipelined computation of MAX that achieves a rate of Ω(1/(log n)^2). Finally, we show how the protocol for computation of MAX can be modified to achieve approximate computation of the histogram. The approximate histogram can be computed in O(n^(7/2) (log n)^(1/2)) bit-periods with high probability.

Collaboration


Dive into Sudeep Kamath's collaborations.

Top Co-Authors

Chandra Nair

The Chinese University of Hong Kong

Sreeram Kannan

University of Washington

D. Manjunath

Indian Institute of Technology Bombay
