
Publication


Featured research published by Daniel Marco.


Information Processing in Sensor Networks | 2003

On the many-to-one transport capacity of a dense wireless sensor network and the compressibility of its data

Daniel Marco; Enrique J. Duarte-Melo; Mingyan Liu; David L. Neuhoff

In this paper we investigate the capability of large-scale sensor networks to measure and transport a two-dimensional field. We consider a data-gathering wireless sensor network in which densely deployed sensors take periodic samples of the sensed field, and then scalar quantize, encode and transmit them to a single receiver/central controller where snapshot images of the sensed field are reconstructed. The quality of the reconstructed field is limited by the ability of the encoder to compress the data to a rate less than the single-receiver transport capacity of the network. Subject to a constraint on the quality of the reconstructed field, we are interested in how fast data can be collected (or equivalently how closely in time these snapshots can be taken) due to the limitation just mentioned. As the sensor density increases to infinity, more sensors send data to the central controller. However, the data is more correlated, and the encoder can do more compression. The question is: Can the encoder compress sufficiently to meet the limit imposed by the transport capacity? Alternatively, how long does it take to transport one snapshot? We show that as the density increases to infinity, the total number of bits required to attain a given quality also increases to infinity under any compression scheme. At the same time, the single-receiver transport capacity of the network remains constant as the density increases. We therefore conclude that for the given scenario, even though the correlation between sensor data increases as the density increases, any data compression scheme is insufficient to transport the required amount of data for the given quality. Equivalently, the amount of time it takes to transport one snapshot goes to infinity.
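The conclusion rests on a simple rate-versus-capacity comparison. As a sketch, write R(n) for the bits needed per snapshot with n sensors and C for the single-receiver transport capacity (generic symbols, not the paper's notation):

```latex
% Delivering one snapshot of R(n) bits over a channel of capacity C
% (bits per unit time) takes at least R(n)/C time units, so
\[
  T(n) \;\ge\; \frac{R(n)}{C},
  \qquad
  R(n) \xrightarrow{\,n\to\infty\,} \infty,
  \quad
  C = O(1)
  \;\Longrightarrow\;
  T(n) \xrightarrow{\,n\to\infty\,} \infty .
\]
```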


IEEE Transactions on Information Theory | 2005

The validity of the additive noise model for uniform scalar quantizers

Daniel Marco; David L. Neuhoff

A uniform scalar quantizer with small step size, large support, and midpoint reconstruction levels is frequently modeled as adding orthogonal noise to the quantizer input. This paper rigorously demonstrates the asymptotic validity of this model when the input probability density function (pdf) is continuous and satisfies several other mild conditions. Specifically, as step size decreases, the correlation between input and quantization error becomes negligible relative to the mean-squared error (MSE). The model is even valid when the input density is discontinuous at the origin, but discontinuities elsewhere can prevent the correlation from being negligible. Though this invalidates the additive model, an asymptotic formula for the correlation is found in terms of the step size and the heights and positions of the discontinuities. For a finite support input density, such as uniform, it is shown that the support of the uniform quantizer can be matched to that of the density in ways that make the correlation approach a variety of limits. The derivations in this paper are based on an analysis of the asymptotic convergence of cell centroids to cell midpoints. This convergence is fast enough that the centroids and midpoints induce the same asymptotic MSE, but not fast enough to induce the same correlations.
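A quick numerical probe of the model (my own sketch, not code from the paper): quantize unit-variance Gaussian samples with a midpoint uniform quantizer and check that the input-error correlation is negligible relative to the MSE as the step size shrinks.

```python
import random
import statistics

def quantize_mid(x, step):
    # Uniform scalar quantizer with midpoint reconstruction levels.
    return (x // step + 0.5) * step

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200_000)]

for step in (1.0, 0.5, 0.25):
    errs = [quantize_mid(x, step) - x for x in xs]
    mse = statistics.fmean(e * e for e in errs)
    corr = statistics.fmean(x * e for x, e in zip(xs, errs))
    # The additive-noise model predicts corr / mse -> 0 as step shrinks,
    # and mse ~ step^2 / 12.
    print(f"step={step}: mse={mse:.5f}  corr/mse={corr / mse:+.4f}")
```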


IEEE Transactions on Information Theory | 2006

Low-resolution scalar quantization for Gaussian sources and squared error

Daniel Marco; David L. Neuhoff

This correspondence analyzes the low-resolution performance of entropy-constrained scalar quantization. It focuses mostly on Gaussian sources, for which it is shown that for both binary quantizers and infinite-level uniform threshold quantizers, as D approaches the source variance σ², the least entropy of such quantizers with mean-squared error D or less approaches zero with slope -(log₂ e)/(2σ²). As the Shannon rate-distortion function approaches zero with the same slope, this shows that in the low-resolution region, scalar quantization with entropy coding is asymptotically as good as any coding technique.
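The claimed slope can be checked numerically; the sketch below is my own construction, using a single-threshold binary quantizer with centroid reconstruction on a unit-variance Gaussian source (none of these choices come from the paper). Pushing the threshold out drives the rate toward zero, and the ratio H/(σ² - D) should approach (log₂ e)/2.

```python
import math

LOG2E_OVER_2 = math.log2(math.e) / 2  # predicted slope magnitude for sigma = 1

def gauss_tail(t):
    # P(X > t) for X ~ N(0, 1).
    return 0.5 * math.erfc(t / math.sqrt(2))

def binary_quantizer(t):
    # Entropy and MSE gap (sigma^2 - D) of a binary quantizer on N(0, 1)
    # with threshold t and centroid (conditional-mean) reconstruction.
    p = gauss_tail(t)
    phi = math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    gap = phi * phi / (p * (1 - p))  # sigma^2 - D, formed without cancellation
    ent = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return ent, gap

# As t grows, the rate -> 0 and D -> sigma^2 = 1; the ratio
# H / (sigma^2 - D) should drift down toward (log2 e) / 2 ~ 0.7213.
for t in (3.0, 5.0, 8.0, 10.0):
    ent, gap = binary_quantizer(t)
    print(f"t={t}: H/(sigma^2 - D) = {ent / gap:.4f}")
```

The convergence is slow (logarithmic corrections in t), so even at t = 10 the ratio sits a few percent above the limit.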


IEEE Transactions on Information Theory | 2009

On Lossless Coding With Coded Side Information

Daniel Marco; Michelle Effros

This paper considers the problem, first introduced by Ahlswede and Körner in 1975, of lossless source coding with coded side information. Specifically, let X and Y be two random variables such that X is desired losslessly at the decoder while Y serves as side information. The random variables are encoded independently, and both descriptions are used by the decoder to reconstruct X. Ahlswede and Körner describe the achievable rate region in terms of an auxiliary random variable. This paper gives a partial solution for an optimal auxiliary random variable, thereby describing part of the rate region explicitly in terms of the distribution of X and Y.
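For context, the Ahlswede-Körner achievable region is usually stated as follows (standard form, not quoted from the paper): rates (R_X, R_Y) are achievable if and only if there exists an auxiliary random variable U forming the Markov chain U - Y - X with

```latex
\[
  R_X \;\ge\; H(X \mid U),
  \qquad
  R_Y \;\ge\; I(Y; U).
\]
```

The difficulty is that U is only implicit in this description; identifying an optimal U is what makes part of the region explicit in the distribution of X and Y.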


International Symposium on Information Theory | 2005

Entropy of quantized data at high sampling rates

Daniel Marco; David L. Neuhoff

This paper considers the entropy of the highly correlated quantized samples resulting from sampling at high rate. Two results are shown. The first concerns sampling and identically scalar quantizing a stationary random process over a finite interval. It is shown that if the process crosses a quantization threshold with positive probability, then the joint entropy of the quantized samples tends to infinity as the sampling interval goes to zero. The second result provides an upper bound to the rate at which the joint entropy tends to infinity, in the case of infinite-level uniform threshold scalar quantizers and a stationary Gaussian random process whose mean lies at a midpoint of some quantization cell. Specifically, an asymptotic formula for the conditional entropy of one quantized sample conditioned on another quantized sample is derived.


Information Theory Workshop | 2002

Distributed encoding of sensor data

David L. Neuhoff; Daniel Marco

In this paper, we ignore transmission issues and focus on the total number of bits that must be transmitted to the collector to form a reconstruction of the field with a given MSE. We assume that all sensors can transmit bits to the collector without error. With this assumption, with the total number of bits as the cost measure, and with this style of coding, it can be argued that sensor-to-sensor relaying offers no advantages. This problem is similar to image coding and transmission, except that the quantization, encoding, and transmission are constrained to take place separately at each sensor (pixel location), in contrast to traditional image coding and transmission, wherein the entire image is available for quantization, encoding, and transmission. Due to the need to separately encode values from separate sensors, we pursue a Slepian-Wolf style coding approach.
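The Slepian-Wolf style approach can be sketched with syndrome coding. The toy below uses a Hamming(7,4) code (my choice for illustration, not the paper's construction): the decoder recovers a 7-bit sensor word from its 3-bit syndrome plus correlated side information, provided the two words differ in at most one position.

```python
# Parity-check matrix of the Hamming(7,4) code: column j is the binary
# expansion of j + 1, so the syndrome of a single-bit error is the
# (1-based) position of that error.
H = [[(j + 1) >> i & 1 for j in range(7)] for i in range(3)]

def syndrome(bits):
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def sw_encode(x):
    # Encoder: sees x only, sends its 3-bit syndrome instead of all 7 bits.
    return syndrome(x)

def sw_decode(s, y):
    # Decoder: sees y and s. syndrome(y) XOR s equals the syndrome of
    # (x XOR y), which names the single position where x and y differ.
    diff = tuple(a ^ b for a, b in zip(syndrome(y), s))
    pos = sum(bit << i for i, bit in enumerate(diff))  # 0 means x == y
    x_hat = list(y)
    if pos:
        x_hat[pos - 1] ^= 1
    return x_hat

# Correlated sources: y agrees with x except in at most one position.
x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 1, 0, 0, 0, 1]
print(sw_decode(sw_encode(x), y) == x)  # x recovered from 3 bits + side info
```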


Information Theory Workshop | 2006

A Partial Solution for Lossless Source Coding with Coded Side Information

Daniel Marco; Michelle Effros

This paper considers the problem, first introduced by Ahlswede and Körner in 1975, of lossless source coding with coded side information. Specifically, let X and Y be two random variables such that X is desired losslessly at the decoder while Y serves as side information. The random variables are encoded independently, and both descriptions are used by the decoder to reconstruct X. Ahlswede and Körner describe the achievable rate region in terms of an auxiliary random variable. This paper gives a partial solution for the optimal auxiliary random variable, thereby describing part of the rate region explicitly in terms of the distribution of X and Y.


IEEE Transactions on Information Theory | 2007

Low-Resolution Scalar Quantization for Gaussian Sources and Absolute Error

Daniel Marco; David L. Neuhoff

This correspondence considers low-resolution scalar quantization for a memoryless Gaussian source with respect to absolute error distortion. It shows that the slope of the operational rate-distortion function of scalar quantization is infinite at the point Dmax where the rate becomes zero. Thus, unlike the situation for squared error distortion, or for Laplacian and exponential sources with squared or absolute error distortion, for a Gaussian source and absolute error, scalar quantization at low rates is far from the Shannon rate-distortion function, i.e., far from the performance of the best lossy coding technique.


IEEE Transactions on Information Theory | 2010

Entropy of Highly Correlated Quantized Data

Daniel Marco; David L. Neuhoff

This paper considers the entropy of highly correlated quantized samples. Two results are shown. The first concerns sampling and identically scalar quantizing a stationary continuous-time random process over a finite interval. It is shown that if the process crosses a quantization threshold with positive probability, then the joint entropy of the quantized samples tends to infinity as the sampling rate goes to infinity. The second result provides an upper bound to the rate at which the joint entropy tends to infinity, in the case of an infinite-level uniform threshold scalar quantizer and a stationary Gaussian random process. Specifically, an asymptotic formula for the conditional entropy of one quantized sample conditioned on the previous quantized sample is derived. At high sampling rates, these results indicate a sharp contrast between the large encoding rate (in bits/sec) required by a lossy source code consisting of a fixed scalar quantizer and an ideal, sampling-rate-adapted lossless code, and the bounded encoding rate required by an ideal lossy source code operating at the same distortion.
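The contrast can be illustrated with a quick simulation (my own sketch, using a unit-variance Ornstein-Uhlenbeck process and a step-0.5 uniform quantizer; none of these parameters come from the paper). As the sampling interval tau shrinks, the per-sample conditional entropy of the quantized process falls, but the implied encoding rate in bits per unit time grows without bound.

```python
import math
import random
from collections import Counter

def ou_samples(n, tau, seed=1):
    # Exact discretization of a stationary unit-variance Gauss-Markov
    # (Ornstein-Uhlenbeck) process sampled every tau time units.
    a = math.exp(-tau)
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)
    out = []
    for _ in range(n):
        out.append(x)
        x = a * x + math.sqrt(1 - a * a) * rng.gauss(0.0, 1.0)
    return out

def cond_entropy(samples, step=0.5):
    # Empirical H(Q_{k+1} | Q_k) in bits for a uniform scalar quantizer.
    q = [math.floor(s / step) for s in samples]
    pairs = Counter(zip(q, q[1:]))
    firsts = Counter(q[:-1])
    n = len(q) - 1
    return sum(c / n * math.log2(firsts[a] / c) for (a, _), c in pairs.items())

for tau in (1.0, 0.1, 0.01):
    h = cond_entropy(ou_samples(300_000, tau))
    # Per-sample entropy shrinks, but bits per unit time blow up.
    print(f"tau={tau}: {h:.3f} bits/sample, {h / tau:.1f} bits/time-unit")
```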


IEEE Transactions on Information Theory | 2009

Markov Random Processes Are Neither Bandlimited nor Recoverable From Samples or After Quantization

Daniel Marco

This paper considers basic questions regarding Markov random processes. It shows that continuous-time, continuous-valued, wide-sense stationary, Markov processes that have absolutely continuous second-order distribution and finite second moment are not bandlimited. It also shows that continuous-time, stationary, Markov processes that are continuous-valued or discrete-valued and satisfy additional mild conditions cannot be recovered from uniform sampling. Further, it shows that continuous-time, continuous-valued, stationary, Markov processes that have absolutely continuous second-order distributions and are continuous almost surely cannot be recovered without error after quantization. Finally, it provides necessary and sufficient conditions for stationary, discrete-time, Markov processes to have zero entropy rate, and relates this to information singularity.
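The stationary Gauss-Markov (Ornstein-Uhlenbeck) process is the standard concrete illustration of the first claim (my example, not one taken from the paper):

```latex
% Autocorrelation of a stationary Gauss-Markov process:
\[
  R(\tau) \;=\; \sigma^2 e^{-\lambda |\tau|}, \qquad \lambda > 0,
\]
% and its power spectral density (the Fourier transform of R):
\[
  S(\omega) \;=\; \frac{2\sigma^2 \lambda}{\lambda^2 + \omega^2},
\]
% a Lorentzian, strictly positive for every omega: the process
% is not bandlimited.
```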

Collaboration


Daniel Marco's most frequent co-authors.

Top Co-Authors

Michelle Effros

California Institute of Technology


Mingyan Liu

University of Michigan
