Publication


Featured research published by Shengyu Zhu.


Wireless Communications and Networking Conference | 2013

Interactive distributed detection with conditionally independent observations

Shengyu Zhu; Earnest Akofor; Biao Chen

This paper deals with interactive distributed detection with conditionally independent observations, where the fusion center may exchange information with a local sensor. Using a two-sensor system, we demonstrate that this two-way interaction improves detection performance over the classical tandem detection system, where only one-way communication is allowed. An important observation is that, contrary to the tandem network, the fusion rule is no longer a simple likelihood ratio test, due to the correlation introduced by the initial feedback from the fusion center to the sensor.
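For contrast with the interactive setting above, the classical baseline is easy to sketch: with conditionally independent observations and no feedback, the optimal fusion of two one-bit sensor decisions is a likelihood ratio test on the received bit pair. A minimal sketch of that baseline (the Gaussian model and the values of mu0, mu1, and tau are illustrative assumptions, not taken from the paper):

```python
import math

import numpy as np

rng = np.random.default_rng(0)

# Two sensors observe N(mu, 1) samples, conditionally independent given the
# hypothesis: H0 has mu = 0, H1 has mu = 1.  Each sensor sends one bit.
mu0, mu1, n = 0.0, 1.0, 100_000
tau = 0.5  # local one-bit quantizer threshold (illustrative choice)

def bits(mu):
    return (rng.normal(mu, 1.0, size=(n, 2)) > tau).astype(int)

u0, u1 = bits(mu0), bits(mu1)

def q(mu):
    # P(observation > tau) for a N(mu, 1) sample, via the error function
    return 0.5 * math.erfc((tau - mu) / math.sqrt(2))

p0, p1 = q(mu0), q(mu1)

def llr(u):
    # Log-likelihood ratio of the received bit pair.  With conditional
    # independence it is monotone in the bit count, so the optimal fusion
    # rule reduces to a threshold test on sum(u) -- a simple LRT.
    b = u.sum(axis=1)
    return b * math.log(p1 / p0) + (2 - b) * math.log((1 - p1) / (1 - p0))

print(llr(u1).mean(), llr(u0).mean())  # positive under H1, negative under H0
```

Once feedback from the fusion center correlates the second-stage bits, this product-form LRT structure is exactly what the paper shows is lost.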


IEEE Transactions on Signal Processing | 2014

Decentralized Data Reduction With Quantization Constraints

Ge Xu; Shengyu Zhu; Biao Chen

A guiding principle for data reduction in statistical inference is the sufficiency principle. This paper extends the classical sufficiency principle to decentralized inference, i.e., data reduction needs to be achieved in a decentralized manner. We examine the notions of local and global sufficient statistics and the relationship between the two for decentralized inference under different observation models. We then consider the impact of quantization on decentralized data reduction, which is often needed when communications among sensors are subject to finite capacity constraints. The central question we intend to ask is: if each node in a decentralized inference system has to summarize its data using a finite number of bits, is it still optimal to implement data reduction using global sufficient statistics prior to quantization? We show that the answer is negative using a simple example and proceed to identify conditions under which sufficiency based data reduction followed by quantization is indeed optimal. They include the well known case when the data at decentralized nodes are conditionally independent as well as a class of problems with conditionally dependent observations that admit conditional independence structure through the introduction of an appropriately chosen hidden variable.
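A concrete instance of lossless sufficiency-based reduction (before any quantization enters the picture): for Gaussian mean estimation, each node's (sum, count) pair is a local sufficient statistic, and the fusion center loses nothing by receiving only the reduced data. A minimal sketch; the Gaussian model and sample sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two nodes observe i.i.d. N(theta, 1) samples; (sum, count) is a local
# sufficient statistic for theta, so each node can reduce its raw data
# to two numbers before forwarding them to the fusion center.
x1 = rng.normal(0.7, 1.0, 500)
x2 = rng.normal(0.7, 1.0, 300)
s1, s2 = (x1.sum(), x1.size), (x2.sum(), x2.size)

# The global MLE computed from the reduced data equals the MLE computed
# from all raw samples -- no inference loss from the reduction.
theta_reduced = (s1[0] + s2[0]) / (s1[1] + s2[1])
theta_full = np.concatenate([x1, x2]).mean()
print(theta_reduced, theta_full)
```

The paper's question is what happens when these forwarded statistics must additionally be quantized to a finite number of bits, where this reduce-then-quantize order can fail to be optimal.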


Asilomar Conference on Signals, Systems and Computers | 2013

Are global sufficient statistics always sufficient: The impact of quantization on decentralized data reduction

Shengyu Zhu; Ge Xu; Biao Chen

The sufficiency principle is the guiding principle for data reduction in various statistical inference problems. There has been recent effort in developing the sufficiency principle for decentralized inference, with a particular emphasis on studying the relationship between global sufficient statistics and local sufficient statistics. We consider in this paper the impact of quantization on decentralized data reduction. The central question we intend to ask is: if each node in a decentralized inference system has to summarize its data using a finite number of bits, is it still sufficient to implement data reduction using global sufficient statistics prior to quantization? We show that the answer is negative using a simple example and proceed to identify conditions under which global sufficient statistics based data reduction is indeed optimal. They include the well known case when the data at decentralized nodes are conditionally independent as well as a class of problems with conditionally dependent data.


International Workshop on Signal Processing Advances in Wireless Communications | 2016

Distributed average consensus with bounded quantization

Shengyu Zhu; Biao Chen

This paper considers distributed average consensus using bounded quantizers with potentially unbounded input data. We develop a quantized consensus algorithm based on a distributed alternating direction method of multipliers (ADMM) algorithm. It is shown that, within finitely many iterations, all the agent variables either converge to the same quantization point or cycle with a finite period. In the convergent case, we derive a consensus error bound, which also applies to the unbounded rounding quantizer provided that the desired average lies within the quantizer output range. Simulations show that the proposed algorithm almost always converges as the network becomes large and dense.
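The flavor of the underlying iteration can be sketched in the unquantized case; the paper then passes every value an agent broadcasts through a bounded quantizer. The following is a generic decentralized-ADMM averaging sketch on an assumed 4-node cycle graph, not the paper's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)

# Decentralized ADMM for average consensus on a connected graph
# (unquantized baseline; the paper additionally quantizes each broadcast).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)   # adjacency of a 4-node cycle
D = np.diag(A.sum(axis=1))                  # degree matrix
r = rng.normal(size=4) * 10                 # agents' (unbounded) inputs
rho = 0.5                                   # ADMM penalty parameter

x, alpha = r.copy(), np.zeros(4)
for _ in range(300):
    # x-update: each agent mixes its input, its dual, and neighbor values
    x = np.linalg.solve(np.eye(4) + 2 * rho * D,
                        r - alpha + rho * (D + A) @ x)
    # dual update driven by disagreement across edges
    alpha = alpha + rho * (D - A) @ x

print(x)  # all entries close to r.mean()
```

With a bounded quantizer inserted into the broadcast step, the iterates can no longer converge exactly, which is what produces the convergence-or-finite-cycle dichotomy described in the abstract.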


International Conference on Acoustics, Speech, and Signal Processing | 2016

Quantized consensus ADMM for multi-agent distributed optimization

Shengyu Zhu; Mingyi Hong; Biao Chen

This paper considers multi-agent distributed optimization with quantized communication, which is needed when inter-agent communications are subject to finite capacity and other practical constraints. To minimize the global objective formed by a sum of local convex functions, we develop a quantized distributed algorithm based on the alternating direction method of multipliers (ADMM). Under certain convexity assumptions, it is shown that the proposed algorithm converges to a consensus within log_{1+η} Ω iterations, where η > 0 depends on the network topology and the local objectives, and Ω is a polynomial fraction depending on the quantization resolution, the distance between the initial and optimal variable values, the local objectives, and the network topology. We also obtain a tight upper bound on the consensus error which does not depend on the size of the network.


International Symposium on Information Theory | 2016

Distributed detection over connected networks via one-bit quantizer

Shengyu Zhu; Biao Chen

This paper considers distributed detection over large-scale connected networks with arbitrary topology. In contrast to the canonical parallel fusion network, where a single node has access to the outputs of all other sensors, each node in the present setting can only exchange one-bit information with its direct neighbors. Our approach adopts a novel consensus-reaching algorithm using asymmetric bounded quantizers that allow controllable consensus error. Under the Neyman-Pearson criterion, we show that, with each sensor employing an identical one-bit quantizer for local information exchange, this approach achieves the optimal error exponent of centralized detection provided that the algorithm converges. Simulations show that the algorithm converges when the network is sufficiently large.
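The connection to centralized detection rests on a simple identity: averaging the per-node log-likelihood ratios recovers the centralized test statistic, so a consensus algorithm that drives every node to that average (which the paper achieves using only one-bit exchanges) matches centralized performance. A sketch of that centralized endpoint under an assumed Gaussian shift model, with exact averaging and no quantizer:

```python
import numpy as np

rng = np.random.default_rng(3)

# Gaussian shift detection: H0 has mean mu0, H1 has mean mu1, unit variance.
mu0, mu1, n_nodes = 0.0, 1.0, 20
x = rng.normal(mu1, 1.0, n_nodes)  # data generated under H1

# Per-node log-likelihood ratio for a single N(mu, 1) observation.
llr = (mu1 - mu0) * x - (mu1**2 - mu0**2) / 2

# The centralized detector thresholds the *average* LLR; any consensus
# scheme that reaches this average at every node reproduces this test.
avg_llr = llr.mean()
decide_h1 = avg_llr > 0
print(avg_llr, decide_h1)
```

The hard part, addressed in the paper, is reaching (or nearly reaching) this average when each exchanged message is a single bit.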


International Conference on Signal and Information Processing | 2013

Data reduction in tandem fusion systems

Shengyu Zhu; Biao Chen

The sufficiency principle is the guiding principle for data reduction in statistical inference. There has been recent effort in developing the sufficiency principle for decentralized inference with a particular emphasis on the relationship between global sufficiency and local sufficiency. This paper studies the sufficiency based data reduction in tandem fusion systems when quantization is needed. We identify conditions such that it is optimal to implement data reduction using sufficient statistics prior to the quantization. They include the well known case when the data at decentralized nodes are conditionally independent as well as a class of problems with conditionally dependent data.


IEEE Transactions on Signal Processing | 2016

Quantized Consensus by the ADMM: Probabilistic Versus Deterministic Quantizers

Shengyu Zhu; Biao Chen


Archive | 2016

Distributed Average Consensus with Bounded Quantizer and Unbounded Input

Shengyu Zhu; Biao Chen


IEEE Global Conference on Signal and Information Processing | 2015

Distributed average consensus with deterministic quantization: An ADMM approach

Shengyu Zhu; Biao Chen

Collaboration


Top Co-Authors

Ge Xu

Syracuse University