Publications


Featured research published by Alexei Kaltchenko.


Advances in Mathematics of Communications | 2008

Entropy estimators with almost sure convergence and an O(n^-1) variance

Alexei Kaltchenko; Nina Timofeeva

The problem of estimating the entropy rate of a stationary ergodic process μ is considered. A new nonparametric entropy rate estimator is constructed for a sample of n sequences (X_1^(1), ..., X_m^(1)), ..., (X_1^(n), ..., X_m^(n)) independently generated by μ. It is shown that, for m = O(log n), the estimator converges almost surely and its variance is upper-bounded by O(n^-1) for a large class of stationary ergodic processes with a finite state space. Since the O(n^-1) order of variance decay in n is the same as that of the optimal Cramér-Rao lower bound, this is the first near-optimal estimator in the sense of variance convergence.
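
The construction in the paper is specific, but the flavor of such estimators can be conveyed in a few lines. The sketch below is a minimal illustration assuming a Grassberger/Wyner-Ziv-style longest-match statistic, not the authors' exact estimator; all names in it are ours.

```python
# A minimal sketch of nonparametric entropy-rate estimation using
# longest-match lengths (a Grassberger/Wyner-Ziv-style recurrence
# statistic), NOT the paper's exact construction.
import math
import random

def longest_match(target, pool):
    """Length of the longest prefix of `target` occurring inside `pool`."""
    joined = "\x00".join(pool)      # separator that never occurs in the data
    lo, hi = 0, len(target)
    while lo < hi:                  # binary search; prefix occurrence is monotone
        mid = (lo + hi + 1) // 2
        if target[:mid] in joined:
            lo = mid
        else:
            hi = mid - 1
    return lo

def entropy_rate_estimate(samples):
    """Crude estimate of H (bits/symbol) from n independent length-m sequences."""
    n = len(samples)
    lengths = [longest_match(samples[i], samples[:i] + samples[i + 1:])
               for i in range(n)]
    return math.log2(n) / (sum(lengths) / n)

random.seed(0)
n, m, p = 500, 30, 0.3              # m = O(log n), matching the theorem's regime
samples = ["".join("1" if random.random() < p else "0" for _ in range(m))
           for _ in range(n)]
print("estimate:", entropy_rate_estimate(samples))
print("true    :", -(p * math.log2(p) + (1 - p) * math.log2(1 - p)))
```

The O(1) bias in the match lengths is exactly the kind of defect the bias and variance analyses in these papers address.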


International Journal of Information and Coding Theory | 2009

Bias reduction via linear combination of nearest neighbour entropy estimators

Alexei Kaltchenko; Nina Timofeeva

The problem of entropy estimation for stationary ergodic processes is considered. A new family of entropy estimators is constructed as a linear combination of nearest neighbour estimators with a new metric. The consistency of the new estimators is established for a broad class of measures. The O(n^-b)-efficiency of these estimators is established for symmetric probability measures, where b > 0 is a constant and n is the number of observations.
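
The paper's estimators live on sequence spaces with a purpose-built metric; as an accessible stand-in, the sketch below combines Kozachenko-Leonenko k-NN differential-entropy estimates in one dimension, with Richardson-style weights (-1 and 2, summing to one) chosen to cancel a hypothetical bias term proportional to 1/k. Everything here is illustrative, not the paper's construction.

```python
# Linear combination of nearest-neighbour entropy estimates: an
# illustration with the Kozachenko-Leonenko k-NN estimator in 1-D.
import math
import random

def knn_entropy(xs, k):
    """Kozachenko-Leonenko k-NN differential entropy estimate (nats), d = 1."""
    n = len(xs)
    xs = sorted(xs)
    log_eps = []
    for i, x in enumerate(xs):
        # in a sorted sample the k-th nearest neighbour lies within k steps
        dists = sorted(abs(x - xs[j])
                       for j in range(max(0, i - k), min(n, i + k + 1)) if j != i)
        log_eps.append(math.log(dists[k - 1]))
    # psi(n) - psi(k) reduces to a harmonic sum; Euler's gamma cancels
    psi_diff = sum(1.0 / j for j in range(k, n))
    return psi_diff + math.log(2.0) + sum(log_eps) / n

random.seed(1)
xs = [random.gauss(0.0, 1.0) for _ in range(4000)]
h1, h2 = knn_entropy(xs, 1), knn_entropy(xs, 2)
combined = -1.0 * h1 + 2.0 * h2                  # weights sum to 1
print("k=1:", h1, " k=2:", h2, " combined:", combined)
print("true:", 0.5 * math.log(2 * math.pi * math.e))   # ≈ 1.4189 nats
```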


Physical Review A | 2008

Reexamination of Quantum Data Compression and Relative Entropy

Alexei Kaltchenko

B. Schumacher and M. Westmoreland have established a quantum analog of a well-known classical information theory result on the role of relative entropy as a measure of non-optimality in (classical) data compression. In this paper, we provide an alternative, simple and constructive proof of this result by constructing quantum compression codes (schemes) from classical data compression codes. Moreover, as the quantum data compression/coding task can be effectively reduced to a (quasi-)classical one, we show that relevant results from classical information theory and data compression become applicable and can therefore be extended to the quantum domain.
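
A quick classical warm-up makes the "non-optimality" reading concrete: coding a source p with an ideal code built for a wrong model q costs H(p) + D(p||q) bits per symbol, so the relative entropy is exactly the compression penalty. The sketch below checks this identity numerically; the distributions are made-up examples.

```python
# Relative entropy as the per-symbol penalty for compressing source p
# with an ideal code designed for the wrong model q.
import math

def entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl(p, q):
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]   # true source
q = [0.8, 0.1, 0.1]     # mistaken model used to build the code
ideal_lengths = [-math.log2(qi) for qi in q]          # ideal q-code lengths
avg_len = sum(pi * li for pi, li in zip(p, ideal_lengths))
print("average length :", avg_len)                    # = H(p) + D(p||q)
print("H(p) + D(p||q) :", entropy(p) + kl(p, q))
```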


International Journal of Bifurcation and Chaos | 2008

Bias reduction of the nearest neighbor entropy estimator

Alexei Kaltchenko; Nina Timofeeva; Eugeniy A. Timofeev

A new family of entropy estimators, constructed as a linear combination (weighted average) of nearest neighbor estimators with slightly different individual properties, is proposed. It is shown that a special sub-optimal selection of the coefficients in the linear combination results in a reduction of the estimator's bias. Computer simulation results are provided.
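
As an analogy for how a weighted average can trade away leading-order bias (again, not the paper's estimator), the simulation below combines plug-in entropy estimates at sample sizes n and n/2 with weights 2 and -1; since the plug-in bias is approximately proportional to 1/n, the combination cancels it.

```python
# Jackknife-style bias reduction by linear combination: the plug-in
# entropy estimator has leading bias ~ c/n, so 2*H(n) - H(n/2) kills it.
import math
import random

def plugin_entropy(sample):
    n = len(sample)
    counts = {}
    for s in sample:
        counts[s] = counts.get(s, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(3)
p = [0.5, 0.2, 0.2, 0.1]
true_h = -sum(pi * math.log2(pi) for pi in p)
n, trials = 40, 20000
plain = combined = 0.0
for _ in range(trials):
    sample = random.choices(range(4), weights=p, k=n)
    h_full = plugin_entropy(sample)
    h_half = plugin_entropy(sample[: n // 2])
    plain += h_full
    combined += 2.0 * h_full - 1.0 * h_half     # weights sum to 1
print("true H        :", true_h)
print("plug-in bias  :", plain / trials - true_h)
print("combined bias :", combined / trials - true_h)
```

The bias reduction comes at the cost of some extra variance, a standard trade-off for such combinations.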


information theory workshop | 2007

Entropy Estimators with Almost Sure Convergence and an O(n^-1) Variance

Alexei Kaltchenko; En-hui Yang; Nina Timofeeva

The problem of estimating the entropy rate of a stationary ergodic process μ is considered. A new nonparametric entropy rate estimator is constructed for a sample of n sequences (X_1^(1), ..., X_m^(1)), ..., (X_1^(n), ..., X_m^(n)) independently generated by μ. It is shown that, for m = O(log n), the estimator converges almost surely and its variance is upper-bounded by O(n^-1) for a large class of stationary ergodic processes with a finite state space. Since the O(n^-1) order of variance decay in n is the same as that of the optimal Cramér-Rao lower bound, this is the first near-optimal estimator in the sense of variance convergence.


international symposium on information theory | 2004

Universal non-destructive estimation of quantum relative entropy

Alexei Kaltchenko

This paper describes a universal, nondestructive estimation of quantum relative entropy. Sequences of independent realizations of two i.i.d. sources over a finite alphabet, together with their marginal distributions, are used to estimate the quantum relative entropy via universal estimation. A product state emitted by a quantum i.i.d. source with a given density matrix is subjected to a unitary transformation into a computational basis, so that the quantum entropy of the source and its eigenbasis are learnt without compromising fidelity, which remains arbitrarily close to unity with arbitrarily high accuracy.
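
For reference, the quantity being estimated is S(rho||sigma) = Tr[rho(log rho - log sigma)]. A direct numerical evaluation for known states (so nothing universal or nondestructive about it) can look as follows; the matrices are toy examples.

```python
# Quantum relative entropy S(rho||sigma) = Tr[rho (log rho - log sigma)]
# for two qubit density matrices (natural log; numpy only).
import numpy as np
from numpy.linalg import eigh

def logm_psd(m):
    """Matrix logarithm of a positive-definite Hermitian matrix."""
    w, v = eigh(m)
    return v @ np.diag(np.log(w)) @ v.conj().T

def quantum_relative_entropy(rho, sigma):
    return float(np.real(np.trace(rho @ (logm_psd(rho) - logm_psd(sigma)))))

rho   = np.array([[0.7, 0.2], [0.2, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])    # maximally mixed qubit
print(quantum_relative_entropy(rho, sigma))   # = log 2 - S(rho)
```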


Proceedings of SPIE | 2014

Efficiency of nearest neighbor entropy estimators for Bernoulli measures

Evgeniy A. Timofeev; Alexei Kaltchenko

A problem of nonparametric entropy estimation for discrete stationary ergodic processes is considered. The estimation is based on the so-called "nearest-neighbor method". It is shown that, for Bernoulli measures, the estimator is unbiased, i.e., it converges to the (inverse) entropy of the process. Moreover, for symmetric Bernoulli measures, the unbiased estimator can be constructed explicitly.
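
The mechanism can be sketched as follows: under the common-prefix metric d(x, y) = 2^(-L), where L is the length of the longest shared prefix, the mean nearest-neighbour prefix length among n samples grows like log2(n)/H. The code below is a cartoon of that idea for a symmetric Bernoulli source, not the paper's estimator.

```python
# Cartoon of the nearest-neighbour idea on sequence space: the mean
# nearest-neighbour prefix length scales like log2(n)/H (up to the
# bias terms these papers analyse), which inverts to an H estimate.
import math
import random

def common_prefix(a, b):
    k = 0
    while k < len(a) and a[k] == b[k]:
        k += 1
    return k

random.seed(2)
n, m, p = 1000, 60, 0.5           # symmetric Bernoulli: H = 1 bit/symbol
seqs = [tuple(random.random() < p for _ in range(m)) for _ in range(n)]
nn_len = [max(common_prefix(seqs[i], seqs[j]) for j in range(n) if j != i)
          for i in range(n)]
mean_len = sum(nn_len) / n
print("1/H estimate:", mean_len / math.log2(n))   # ≈ 1.0
print("H estimate  :", math.log2(n) / mean_len)
```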


Proceedings of SPIE | 2014

Improving the efficiency of nonparametric entropy estimation

Evgeniy A. Timofeev; Alexei Kaltchenko

A problem of improving the efficiency of nonparametric entropy estimation for discrete stationary ergodic processes is considered. The estimation depends on the choice of an underlying metric on the space of right-sided infinite sequences. A new family of metrics depending on a set of parameters is proposed. The estimator is linear in the parameters, and the best accuracy is achieved by solving a system of linear equations.
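
The last sentence describes a generic recipe: when an estimator is linear in its parameters and each accuracy requirement is a linear constraint, the best parameters solve a linear system. The schematic below illustrates this; the constraint matrix is invented for illustration and is not taken from the paper.

```python
# Schematic of "best accuracy via a linear system": impose a
# normalisation plus the cancellation of two assumed error terms,
# then solve A w = b for the parameters w.
import numpy as np

ks = np.array([1.0, 2.0, 3.0])
A = np.vstack([np.ones_like(ks),   # normalisation: weights sum to 1
               1.0 / ks,           # kill an assumed error term ~ 1/k
               1.0 / ks ** 2])     # kill an assumed error term ~ 1/k^2
b = np.array([1.0, 0.0, 0.0])
w = np.linalg.solve(A, b)
print("parameters:", w)            # [0.5, -4.0, 4.5]
print("residuals :", A @ w - b)    # ~ 0
```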


Advances in Mathematics of Communications | 2014

Nearest-neighbor entropy estimators with weak metrics

Evgeniy A. Timofeev; Alexei Kaltchenko

A problem of improving the accuracy of nonparametric entropy estimation for a stationary ergodic process is considered. New weak metrics are introduced, and relations between metrics, measures, and entropy are discussed. Based on the weak metrics, a new nearest-neighbor entropy estimator is constructed; it has a parameter through which it can be optimized to reduce its bias. It is shown that the estimator's variance is upper-bounded at the rate of the nearly optimal Cramér-Rao lower bound.


Proceedings of SPIE | 2013

Entropy estimation and Fibonacci numbers

Evgeniy A. Timofeev; Alexei Kaltchenko

We introduce a new metric on the space of right-sided infinite sequences drawn from a finite alphabet. Emerging from a problem of entropy estimation for a discrete stationary ergodic process, the metric is interesting in its own right and exhibits some notable properties. In particular, the number of distinct metric values for a set of sequences of length m is equal to F_{m+3} − 1, where F_k denotes the k-th Fibonacci number.
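
The count can be tabulated directly from the Fibonacci recurrence (assuming the usual convention F_1 = F_2 = 1; the metric itself is defined in the paper):

```python
# Tabulate the predicted number of distinct metric values, F_{m+3} - 1.
def fib(k):                        # F_1 = F_2 = 1
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

for m in range(1, 8):
    print(f"m = {m}: predicted distinct values = {fib(m + 3) - 1}")
```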

Collaboration


Alexei Kaltchenko's top co-authors:

Nina Timofeeva (Wilfrid Laurier University)
En-hui Yang (Wilfrid Laurier University)
Oleg Semenov (Wilfrid Laurier University)