John L. Eltinge
Bureau of Labor Statistics
Publications
Featured research published by John L. Eltinge.
Journal of the American Statistical Association | 2011
Daniell Toth; John L. Eltinge
In the past several years a wide range of methods for the construction of regression trees and other estimators based on the recursive partitioning of samples have appeared in the statistics literature. Many applications involve data collected through a complex sample design. At present, however, relatively little is known regarding the properties of these methods under complex designs. This article proposes a method for incorporating information about the complex sample design when building a regression tree using a recursive partitioning algorithm. Sufficient conditions are established for asymptotic design L2 consistency of these regression trees as estimators for an arbitrary regression function. The proposed method is illustrated with Occupational Employment Statistics establishment survey data linked to Quarterly Census of Employment and Wage payroll data of the Bureau of Labor Statistics. Performance of the nonparametric estimator is investigated through a simulation study based on this example.
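To make the core idea concrete, the sketch below shows one simple way survey design information (here, just the survey weights) might enter a recursive partitioning algorithm through a design-weighted split criterion. This is an illustrative sketch, not the estimator proposed in the article: the function names, the single-predictor setting, and the stopping rules are all assumptions made for exposition.

```python
# Minimal sketch (not the authors' implementation): greedy binary partitioning
# of one predictor, with survey weights used in both the split criterion
# (design-weighted SSE) and the node-level predictions.
import numpy as np

def weighted_sse(y, w):
    """Design-weighted sum of squared errors around the weighted mean."""
    mu = np.average(y, weights=w)
    return np.sum(w * (y - mu) ** 2)

def grow_tree(x, y, w, min_node=50, depth=0, max_depth=4):
    """Recursively split predictor x to predict y, using weights w."""
    node = {"n": y.size, "pred": np.average(y, weights=w)}
    if y.size < 2 * min_node or depth >= max_depth:
        return node
    best_cut, best_sse = None, weighted_sse(y, w)
    for cut in np.unique(x)[:-1]:          # candidate split points
        left = x <= cut
        if left.sum() < min_node or (~left).sum() < min_node:
            continue
        sse = weighted_sse(y[left], w[left]) + weighted_sse(y[~left], w[~left])
        if sse < best_sse:
            best_cut, best_sse = cut, sse
    if best_cut is None:                   # no split improves the weighted SSE
        return node
    left = x <= best_cut
    node["cut"] = best_cut
    node["left"] = grow_tree(x[left], y[left], w[left], min_node, depth + 1, max_depth)
    node["right"] = grow_tree(x[~left], y[~left], w[~left], min_node, depth + 1, max_depth)
    return node

# Toy usage with simulated data and made-up weights.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 1000)
w = rng.uniform(0.5, 2.0, 1000)            # stand-in for survey weights
y = np.where(x > 5, 3.0, 1.0) + rng.normal(0, 0.5, 1000)
tree = grow_tree(x, y, w)
```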
Journal of Official Statistics | 2013
John L. Eltinge; Paul Biemer; Anders Holmberg
This article outlines a framework for formal description, justification, and evaluation in the development of architectures for large-scale statistical production systems. Following an introduction of the main components of the framework, we consider four related issues: (1) use of simple schematic models for survey quality, cost, risk, and stakeholder utility to outline several groups of questions that may inform decisions on system design and architecture; (2) integration of system architecture with models for total survey quality (TSQ) and adaptive total design (ATD); (3) possible use of concepts from the Generic Statistical Business Process Model (GSBPM) and the Generic Statistical Information Model (GSIM); and (4) the role of governance processes in the practical implementation of these ideas.
Journal of Official Statistics | 2013
Boris Lorenc; Paul Biemer; Ingegerd Jansson; John L. Eltinge; Anders Holmberg
Many survey professionals have become aware of an increasing interdependence among statistical methods, statistical practice as implemented in tools and procedures, and information technology (IT). This dependence is reflected in a shift towards standardized processes and tools, integrated systems for the production of statistics, statistical modernization programmes, transitions from stove-pipe-based to systems-based statistical production, and similar activities. These developments have arisen within statistical organizations and through cross-organizational collaboration.

In developing this special issue, we identified two trends, one in methodology and one in IT systems, that were influential in these developments. The methodological trend leads towards a balanced evaluation and treatment of the components of aggregate error associated with a statistical product. The result, sometimes called Total Survey Design or, more generally, Adaptive Total Design (ATD), is also well integrated with quality perspectives on the production of statistics. The IT-system trend centers on developments in "enterprise architecture," which, very broadly, is a formal, structured description of an enterprise. These developments are also influenced by the increasing financial pressure experienced by large-scale statistical organizations.

Twenty years ago, introducing a special issue of JOS (1993, Vol. 9:1) on "Current Research and Policy Issues at Statistical Agencies and Organizations," the then editor-in-chief Lars Lyberg noted that statistical agencies are undergoing "a perennial flow of the flux and change", and followed with a list of areas affected by new developments. The list included, among other items, user needs and the external survey environment, development of new programs, interagency harmonization, new technologies, and confidentiality protection. All of these were said to be influenced by increasing pressure for productivity and quality gains. Most of these issues remain with us today in one form or another.

Within the context defined by these evergreen topics, there has been substantial progress in both of the abovementioned trends. Embracing the enterprise architecture approach, statistical organizations have in recent years pursued active international collaboration on the foundations of the production of statistics. Common conceptual frameworks, such as the Generic Statistical Business Process Model (GSBPM) and the Generic Statistical Information Model (GSIM), are the results of such international initiatives. In addition, there is a wide range of
Journal of Official Statistics | 2014
MoonJung Cho; John L. Eltinge; Julie Gershunskaya; Larry Huff
Two sets of diagnostics are presented to evaluate the properties of generalized variance functions (GVFs) for a given sample survey. The first set uses test statistics for the coefficients of multiple regression forms of GVF models. The second set uses smoothed estimators of the mean squared error (MSE) of GVF-based variance estimators. The smoothed version of the MSE estimator can provide a useful measure of the performance of a GVF estimator relative to the variance of a standard design-based variance estimator. Some of the proposed methods are applied to sample data from the Current Employment Statistics survey.
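For orientation, a common textbook form of a generalized variance function (shown here only as background; it is not necessarily the specification evaluated in this article) relates the relative variance of an estimated total to its level:

\[
\widehat{\operatorname{relvar}}(\hat{X}) \;=\; \frac{\widehat{V}(\hat{X})}{\hat{X}^{2}} \;\approx\; \alpha + \frac{\beta}{\hat{X}},
\]

where \(\alpha\) and \(\beta\) are estimated by regressing direct relative-variance estimates on \(1/\hat{X}\) across a collection of survey items, and the fitted curve is then used in place of item-specific variance estimates.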
Journal of Official Statistics | 2014
MoonJung Cho; John L. Eltinge; Julie Gershunskaya; Larry Huff
Large-scale establishment surveys often exhibit substantial temporal or cross-sectional variability in their published standard errors. This article uses a framework defined by survey generalized variance functions to develop three sets of analytic tools for the evaluation of these patterns of variability. These tools are for (1) identification of predictor variables that explain some of the observed temporal and cross-sectional variability in published standard errors; (2) evaluation of the proportion of variability attributable to the abovementioned predictors, equation error, and estimation error, respectively; and (3) comparison of equation-error variances across groups defined by observable predictor variables. The primary ideas are motivated and illustrated by an application to the U.S. Current Employment Statistics program.
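A schematic version of the kind of model that separates these sources of variability (with notation invented here for illustration, not taken from the article) is

\[
\log \widehat{V}_{it} \;=\; x_{it}^{\top}\beta \;+\; q_{it} \;+\; e_{it},
\]

where \(\widehat{V}_{it}\) is the published variance (squared standard error) estimate for domain \(i\) at time \(t\), \(x_{it}\) collects observable predictors, \(q_{it}\) is equation error (lack of fit of the GVF model itself), and \(e_{it}\) is estimation error in the direct variance estimator. Read this way, tool (1) concerns the choice of \(x_{it}\), tool (2) apportions variability among the three terms, and tool (3) compares the variance of \(q_{it}\) across groups.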
Archive | 2003
David Swanson; Moon Jung Cho; John L. Eltinge
Archive | 2010
John L. Eltinge
Archive | 2007
MoonJung Cho; Te-Ching Chen; Patrick A. Bobbitt; James A. Himelein; Steve P. Paben; Lawrence R. Ernst; John L. Eltinge
Archive | 2013
John L. Eltinge
Statistica Sinica | 2018
Wei-Yin Loh; John L. Eltinge; Moon Jung Cho; Yuanzhi Li