Publication


Featured research published by George G. Roussas.


Annals of the Institute of Statistical Mathematics | 1969

Nonparametric estimation in Markov processes

George G. Roussas

The problem of statistical inferences in Markov processes has received considerable attention during the last fifteen years. Much of the work consists in carrying over to the Markov case the maximum likelihood and chi-square methods from processes with independent identically distributed random variables. (See, for example, [1] and other references cited there.) Alternative approaches have also been adopted [11], some of which [7] refer to statistical inferences in more general processes. It is not long ago that presumably the first paper [10] appeared on nonparametric estimation of the density in the case of independent identically distributed random variables. Soon a number of others ([14], [8], [13], [3], [6]) followed, which by using either similar or different methods obtained further results. The purpose of the present paper is to consider the nonparametric estimation of densities in the case of Markov processes. The methods being used and results being obtained here are similar to those in [9]. What we do specifically here is this: We first construct asymptotically unbiased estimates for the initial and (two-dimensional) joint densities. This is done in section 2. In section 3 these estimates are shown to be consistent in quadratic mean, and furthermore a consistent, in the probability sense, estimate for the transition density is obtained. Finally, it is proved in section 4 that, under suitable conditions, all three estimators mentioned, properly normalized, are asymptotically normal. The appropriate versions of the Central Limit Theorem which are used for this purpose are stated and proved in an appendix, so that the continuity of the paper will not be interrupted.
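
To make the construction concrete, here is a minimal Python sketch of kernel estimates of this kind for a real-valued Markov chain: the initial and two-dimensional joint densities are estimated by kernel smoothing, and the transition density by their ratio. The Gaussian kernel, the bandwidth, and the simulated AR(1) chain are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard Gaussian kernel."""
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def initial_density(x, chain, h):
    """Kernel estimate of the initial (marginal) density f(x) from X_1, ..., X_n."""
    return gaussian_kernel((x - chain) / h).mean() / h

def joint_density(x, y, chain, h):
    """Kernel estimate of the two-dimensional joint density of (X_j, X_{j+1})."""
    past, future = chain[:-1], chain[1:]
    k = gaussian_kernel((x - past) / h) * gaussian_kernel((y - future) / h)
    return k.mean() / h**2

def transition_density(x, y, chain, h):
    """Transition density estimate as the ratio of the joint and initial estimates."""
    return joint_density(x, y, chain, h) / initial_density(x, chain, h)

# Illustrative example: an AR(1) chain whose true transition density q(y | x) is N(0.5 x, 1).
rng = np.random.default_rng(0)
chain = np.zeros(5000)
for j in range(1, chain.size):
    chain[j] = 0.5 * chain[j - 1] + rng.standard_normal()
print(transition_density(0.0, 0.5, chain, h=0.3))
```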


Stochastic Processes and their Applications | 1990

Nonparametric regression estimation under mixing conditions

George G. Roussas

For j = 1, 2, ..., let {Z_j} = {(X_j, Y_j)} be a strictly stationary sequence of random variables, where the X's and the Y's are R^p-valued and R^q-valued, respectively, for some integers p, q ≥ 1. Let φ be an integrable Borel real-valued function defined on R^q, and set r(x) = E[φ(Y_1) | X_1 = x]. The function φ need not be bounded. The quantity r(x) is estimated by r_n(x) = R_n(x)/f_n(x), where f_n(x) is a kernel estimate for the probability density function f of the X's and R_n(x) = (n h^p)^{-1} Σ_{j=1}^n φ(Y_j) K((x − X_j)/h). If the sequence {Z_j} enjoys any one of the standard four kinds of mixing properties, then, under suitable additional assumptions, r_n(x) is strongly consistent, uniformly over compacts. Rates of convergence are also specified.
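
The estimate r_n(x) = R_n(x)/f_n(x) described above can be sketched in a few lines of Python. The product Gaussian kernel, the bandwidth h, the choice φ(y) = y, and the simulated data are all illustrative assumptions; the paper itself works under general mixing conditions with an arbitrary integrable φ.

```python
import numpy as np

def product_gaussian_kernel(u):
    """Product Gaussian kernel on R^p; u has shape (n, p)."""
    return np.exp(-0.5 * np.sum(u**2, axis=-1)) / (2.0 * np.pi) ** (u.shape[-1] / 2)

def r_n(x, X, Y, phi, h):
    """r_n(x) = R_n(x) / f_n(x), with
    R_n(x) = (n h^p)^{-1} sum_j phi(Y_j) K((x - X_j)/h) and
    f_n(x) the usual kernel density estimate of the X's."""
    n, p = X.shape
    weights = product_gaussian_kernel((x - X) / h) / (n * h**p)
    return np.sum(phi(Y) * weights) / weights.sum()

# Illustrative example: Y_j = sin(X_j) + noise, phi the identity, so r(x) is about sin(x).
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(2000, 1))
Y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(2000)
print(r_n(np.array([1.0]), X, Y, phi=lambda y: y, h=0.3))
```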


Stochastic Analysis and Applications | 1987

Moment inequalities for mixing sequences of random variables

George G. Roussas; D.A. Ioannides

In this work, sequences of random variables satisfying certain mixing conditions are considered. After the relevant definitions are presented, some alternative characterizations are discussed. Also, illustrative examples are given for each case considered. Finally, various moment inequalities are extensively discussed in a systematic manner. These inequalities are interesting in their own right and also useful in statistical applications. Certain such applications will be presented in a separate report to avoid overloading the present one.


Journal of Statistical Planning and Inference | 1988

Nonparametric estimation in mixing sequences of random variables

George G. Roussas

Let X_1, X_2, ... be random variables defined on (Ω, A, P) and taking values in R^t, t ≥ 1. Suppose that the sequence {X_j}, j ≥ 1, is strictly stationary and φ_i-mixing, for some i = 1, ..., 4, and let f be the probability density function of X_1. Let f_n(x) be the usual kernel estimate of f(x), x ∈ R^t. Under certain conditions, it is shown that f_n(x) is a strongly consistent estimate of f(x). Under some additional conditions, this consistency is shown to be uniform over certain sets extending over all of R^t as the sample size n tends to infinity. These results, specialized to certain Markov processes, provide strongly consistent estimates, as well as uniformly, as above, strongly consistent estimates, for the initial, the 2t-variate joint and the transition probability density functions. Finally, a uniformly, in the above sense, strongly consistent estimate is obtained for the one-step transition distribution function of the process.
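
As an illustration of the last point, the following sketch implements one standard Nadaraya-Watson-type form of a one-step transition distribution function estimate, not necessarily the exact estimator of the paper: each observed transition (X_j, X_{j+1}) is weighted by a kernel centered at the conditioning point. The kernel, bandwidth and simulated chain are assumptions made only for the example.

```python
import numpy as np

def gaussian_kernel(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)

def transition_df(x, y, chain, h):
    """Kernel-weighted estimate of F(y | x) = P(X_{j+1} <= y | X_j = x):
    weight each observed transition (X_j, X_{j+1}) by K((x - X_j)/h)."""
    past, future = chain[:-1], chain[1:]
    w = gaussian_kernel((x - past) / h)
    return np.sum(w * (future <= y)) / np.sum(w)

# Illustrative example on a simulated AR(1) chain: F(y | x) is the N(0.5 x, 1) c.d.f. at y.
rng = np.random.default_rng(2)
chain = np.zeros(5000)
for j in range(1, chain.size):
    chain[j] = 0.5 * chain[j - 1] + rng.standard_normal()
print(transition_df(x=0.0, y=1.0, chain=chain, h=0.3))
```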


Journal of Multivariate Analysis | 1992

Fixed design regression for time series: asymptotic normality

Lanh Tat Tran; D. A. Ioannides; George G. Roussas

Consider the fixed design regression model with general weights, and suppose that the error random variables come from a strictly stationary stochastic process satisfying the strong mixing condition. The asymptotic normality of the proposed estimate is established under weak conditions. The applicability of the results obtained is demonstrated by way of two existing estimates, the Gasser-Müller estimate and that of Priestley and Chao. The asymptotic normality of these estimates is further illustrated by means of a concrete example from the class of autoregressive processes.
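
For concreteness, here is a short Python sketch of the Priestley-Chao estimate mentioned above, applied to an equally spaced fixed design with stationary AR(1) errors. The Gaussian kernel, the bandwidth and the simulated regression model are illustrative assumptions, not those of the paper.

```python
import numpy as np

def priestley_chao(x, design, Y, h):
    """Priestley-Chao estimate at x for an ordered fixed design:
    g_n(x) = h^{-1} sum_i (x_i - x_{i-1}) K((x - x_i)/h) Y_i."""
    spacings = np.diff(design, prepend=design[0])  # x_i - x_{i-1}; first spacing is 0
    k = np.exp(-0.5 * ((x - design) / h) ** 2) / np.sqrt(2.0 * np.pi)
    return np.sum(spacings * k * Y) / h

# Illustrative example: equally spaced design on [0, 1] with AR(1) (strong mixing) errors.
rng = np.random.default_rng(3)
n = 500
design = np.linspace(0.0, 1.0, n)
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.5 * eps[i - 1] + 0.1 * rng.standard_normal()
Y = np.cos(2 * np.pi * design) + eps
print(priestley_chao(0.25, design, Y, h=0.05))  # true g(0.25) = 0
```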


Archive | 1991

Nonparametric functional estimation and related topics

George G. Roussas

I. Curve and Functional Estimation
- Reproducing Kernels and Finite Order Kernels
- Laws of the Iterated Logarithm for Density Estimators
- Exponential Inequalities in Nonparametric Estimation
- Conservative Confidence Bands for Nonparametric Regression
- Data-Adaptive Kernel Estimation
- On the Nonparametric Estimation of the Entropy Functional

II. Curve and Functional Estimation (Continued)
- Analysis of Samples of Curves
- Bootstrap Methods in Nonparametric Regression
- On the Influence Function of Maximum Penalized Likelihood Density Estimators
- Nonparametric Curve Estimation and Simple Curve Characteristics
- Applications of Multiparameter Weak Convergence for Adaptive Nonparametric Curve Estimation
- On Asymptotic Efficiency of Average Derivative Estimates
- Nonparametric Estimation of Elliptically Contoured Densities
- Uniform Deconvolution: Nonparametric Maximum Likelihood and Inverse Estimation

III. Parameter Selection, Smoothing
- Smoothing Parameter Selection in Image Restoration
- Estimating the Quantile-Density Function
- Data-Driven Smoothing Based on Convexity Properties
- Prospects for Automatic Bandwidth Selection in Extensions to Basic Kernel Density Estimation
- Root n Bandwidth Selection
- Estimating Smooth Distribution Functions
- Smoothing Techniques in Time Series Analysis

IV. Regression Models
- Nonparametric Inference in Heteroskedastic Regression
- Bounded Influence Regression in the Presence of Heteroskedasticity of Unknown Form
- Linear Regression with Randomly Right-Censored Data Using Prior Nonparametric Estimation
- Universal Consistencies of a Regression Estimate for Unbounded Regression Functions
- Minimax Bayes Estimation, Penalized Likelihood Methods, and Restricted Minimax Estimation
- On Exponential Bounds on the Bayes Risk of the Nonparametric Classification Rules
- Nonparametric Regression Analysis of Some Economic Data

V. Dependent Data
- Nonparametric Regression Methods for Repeated Measurements
- Nonparametric Prediction for Unbounded Almost Stationary Processes
- Monte Carlo and Turbulence
- Kernel Density Estimation Under a Locally Mixing Condition
- Nonparametric Estimation of Survival Functions Under Dependent Competing Risks
- Estimation of Transition Distribution Function and Its Quantiles in Markov Processes: Strong Consistency and Asymptotic Normality
- L1 Strong Consistency for Density Estimates in Dependent Samples

VI. Time Series Analysis, Signal Detection
- Nonparametric Statistical Signal Detection Problems
- Functional Identification in Nonlinear Time Series
- Modelization, Nonparametric Estimation and Prediction for Continuous Time Processes
- Estimation of Chaotic Dynamic Systems with Control Variables
- Nonparametric Estimation of a Class of Nonlinear Time Series Models
- Semiparametric and Nonparametric Inference from Irregular Observations on Continuous Time Stochastic Processes

VII. Various Topics
- Complexity Regularization with Application to Artificial Neural Networks
- Designing Prediction Bands
- Analysis of Observational Studies from the Point of View of Nonparametric Regression
- Some Issues in Cross-Validation
- Nonparametric Function Estimation Involving Errors-in-Variables

VIII. Various Topics (Continued)
- A Consistent Goodness of Fit Test Based on the Total Variation Distance
- On a Problem in Semiparametric Estimation
- On the Integrable and Approximately Integrable Linear Statistical Models
- Nonparametric Techniques in Image Estimation
- Regularized Deconvolution on the Circle and the Sphere

List of Attendants. Contributed Papers.


Statistics & Probability Letters | 1989

Consistent regression estimation with fixed design points under dependence conditions

George G. Roussas

For n = 1, 2, ... and i an integer between 1 and n, let x_ni be fixed design points in a compact subset S of a Euclidean space, and let Y_ni be observations taken at these points through g, an unknown continuous real-valued function on that space, and subject to errors ε_ni; that is, Y_ni = g(x_ni) + ε_ni. For any x, g(x) is estimated by g_n(x; x_n) = Σ_{i=1}^n w_ni(x; x_n) Y_ni, where x_n = (x_n1, ..., x_nn) and the w_ni(·;·) are suitable weights. If the errors ε_ni are centered at their expectations, the proposed estimate is asymptotically unbiased. It is also consistent in quadratic mean and strongly consistent if, in addition and for each n, the random variables ε_ni, i ≥ 1, come from a strictly stationary sequence obeying any one of the four standard modes of mixing.
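
The general scheme g_n(x; x_n) = Σ_i w_ni(x; x_n) Y_ni can be expressed as a small generic routine that accepts any weight function. The normalized Gaussian-kernel weights below are only one convenient, hypothetical choice, and the simulated AR(1) errors merely illustrate the dependence setting.

```python
import numpy as np

def weighted_regression(x, design, Y, weights):
    """Generic fixed-design estimate g_n(x) = sum_i w_ni(x) Y_ni, where
    weights(x, design) returns the vector (w_n1(x), ..., w_nn(x))."""
    return np.sum(weights(x, design) * Y)

def normalized_kernel_weights(h):
    """One concrete (hypothetical) weight choice: normalized Gaussian-kernel weights."""
    def w(x, design):
        k = np.exp(-0.5 * ((x - design) / h) ** 2)
        return k / k.sum()
    return w

# Illustrative example: Y_ni = g(x_ni) + eps_ni with stationary AR(1) errors.
rng = np.random.default_rng(5)
n = 400
design = np.linspace(0.0, 1.0, n)
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.4 * eps[i - 1] + 0.1 * rng.standard_normal()
Y = np.exp(design) + eps
print(weighted_regression(0.5, design, Y, normalized_kernel_weights(h=0.05)))  # true g(0.5) is about 1.65
```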


Statistics & Probability Letters | 2000

Asymptotic normality of the kernel estimate of a probability density function under association

George G. Roussas

The sole purpose of this paper is to establish asymptotic normality of the usual kernel estimate of the marginal probability density function of a strictly stationary sequence of associated random variables. In much of the discussion and the derivations, the term association is used to include both positively and negatively associated random variables. The method of proof follows the familiar pattern, for dependent situations, of using large and small blocks. A result recently made available in the literature is instrumental in the derivations.


Statistics & Probability Letters | 1991

Kernel estimates under association: strong uniform consistency

George G. Roussas

Let X_1, X_2, ... be associated random variables forming a strictly stationary sequence, and let f be the probability density function of X_1. For r ≥ 0 an integer, let f^(r) be the rth order derivative of f. Under suitable regularity conditions on a kernel function K, a sequence of bandwidths {h_n}, the derivatives f^(s), s = 0, 1, ..., r, and the covariances Cov(X_1, X_i), i ≥ 2, the usual kernel estimate of f^(r)(x) is shown to be strongly consistent, uniformly in x. An application is also presented in the estimation of the hazard rate. Finally, certain convergence rates are also discussed.
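
A minimal sketch of the two estimates involved: the kernel estimate of f^(r), implemented here only for r = 0 and r = 1 with the Gaussian kernel, and a plug-in hazard rate estimate f_n(x)/(1 − F_n(x)). The kernel, the bandwidth and the i.i.d. exponential sample are illustrative assumptions; the paper's results concern associated sequences.

```python
import numpy as np

def kernel_density_derivative(x, X, h, r=0):
    """Kernel estimate of the r-th derivative of f:
    f_n^(r)(x) = (n h^(r+1))^{-1} sum_j K^(r)((x - X_j)/h),
    implemented for r = 0, 1 with the Gaussian kernel (K'(u) = -u K(u))."""
    u = (x - X) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    Kr = K if r == 0 else -u * K
    return Kr.mean() / h ** (r + 1)

def hazard_rate(x, X, h):
    """Plug-in hazard estimate f_n(x) / (1 - F_n(x)), with F_n the empirical d.f."""
    F_n = np.mean(X <= x)
    return kernel_density_derivative(x, X, h) / (1.0 - F_n)

# Illustrative example: exponential(1) data, whose true hazard rate is constant, equal to 1.
rng = np.random.default_rng(6)
X = rng.exponential(1.0, size=5000)
print(hazard_rate(1.0, X, h=0.1))
```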


Statistics & Probability Letters | 1992

Uniform strong estimation under α-mixing, with rates

Zongwu Cai; George G. Roussas

Let {X_n}, n ≥ 1, be a stationary α-mixing sequence of real-valued r.v.s with distribution function (d.f.) F, probability density function (p.d.f.) f and mixing coefficient α(n). The d.f. F is estimated by the empirical d.f. F_n, based on the segment X_1, ..., X_n. By means of a mixingale argument, it is shown that F_n(x) converges almost surely to F(x) uniformly in x ∈ R. An alternative approach, utilizing a Kiefer process approximation, establishes the law of the iterated logarithm for sup{|F_n(x) − F(x)|; x ∈ R}. The d.f. F is also estimated by a smooth estimate F̃_n, which is shown to converge almost surely (a.s.) to F, and the rate of convergence of sup{|F̃_n(x) − F(x)|; x ∈ R} is of the order of O((log log n / n)^{1/2}). The p.d.f. f is estimated by the usual kernel estimate f_n, which is shown to converge a.s. to f uniformly in x ∈ R, and the rate of this convergence is of the order of O((log log n / (n h_n^2))^{1/2}), where h_n is the bandwidth used in f_n. As an application, the hazard rate r is estimated either by r_n or r̃_n, depending on whether F_n or F̃_n is employed, and it is shown that r_n(x) and r̃_n(x) converge a.s. to r(x), uniformly over certain compact subsets of R, and the rate of convergence is again of the order of O((log log n / (n h_n^2))^{1/2}). Finally, the rth order derivative of f, f^(r), is estimated by f^(r)_n, and it is shown that f^(r)_n(x) converges a.s. to f^(r)(x) uniformly in x ∈ R. The rate of this convergence is of the order of O((log log n / (n h_n^{2(r+1)}))^{1/2}).
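
For illustration, the empirical d.f. F_n and a smooth d.f. estimate of the kind described, here taken to be a kernel-integrated estimate with the standard normal CDF as integrated kernel, can be compared on a simulated α-mixing AR(1) sequence. The kernel, bandwidth and model are assumptions made for the example, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def empirical_df(x, X):
    """Empirical distribution function F_n(x)."""
    return np.mean(X <= x)

def smooth_df(x, X, h):
    """Smooth d.f. estimate: n^{-1} sum_j W((x - X_j)/h), with W the
    integrated kernel (here the standard normal CDF)."""
    return np.mean(norm.cdf((x - X) / h))

# Illustrative example comparing the two estimates on AR(1) (alpha-mixing) data.
rng = np.random.default_rng(7)
X = np.zeros(5000)
for j in range(1, X.size):
    X[j] = 0.5 * X[j - 1] + rng.standard_normal()
print(empirical_df(0.0, X), smooth_df(0.0, X, h=0.2))  # both near 0.5 by symmetry
```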

Collaboration


Dive into George G. Roussas's collaborations.

Top Co-Authors

Madan L. Puri, Indiana University Bloomington
Marc Hallin, Université libre de Bruxelles
Lanh Tat Tran, Indiana University Bloomington
Michael G. Akritas, University of Wisconsin-Madison
Bruce Lind, University of Wisconsin-Madison
Richard A. Johnson, University of Wisconsin-Madison