Network


Latest external collaborations at the country level. Click on a dot to dive into the details.

Hotspot


Dive into the research topics where Nicoleta Serban is active.

Publication


Featured research published by Nicoleta Serban.


Journal of the American Statistical Association | 2005

CATS: Clustering after transformation and smoothing

Nicoleta Serban; Larry Wasserman

CATS (clustering after transformation and smoothing) is a technique for nonparametrically estimating and clustering a large number of curves. Our motivating example is a genetic microarray experiment, but the method is very general. The method includes transforming and smoothing multiple curves, multiple nonparametric testing for screening out flat curves, clustering curves with similar shapes, and nonparametrically inferring the clustering estimation error rate.
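As a rough illustration of a CATS-style pipeline (not the authors' implementation), the sketch below smooths simulated curves with a moving average standing in for spline smoothing, screens out flat curves with a simple variance threshold standing in for the nonparametric tests, and clusters the survivors with plain k-means. All shapes, noise levels, and thresholds are invented.

```python
import numpy as np

def smooth(curve, w=5):
    # moving-average smoother (a simple stand-in for spline smoothing)
    return np.convolve(curve, np.ones(w) / w, mode="same")

def screen_flat(curves, tol=0.05):
    # crude stand-in for the multiple nonparametric tests:
    # drop curves whose variance is below tol (i.e. "flat" curves)
    return [i for i, c in enumerate(curves) if np.var(c) > tol]

def kmeans(X, k, iters=20):
    # plain Lloyd's algorithm on the smoothed, screened curves
    centers = X[:: max(1, len(X) // k)][:k].astype(float)
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 50)
# two non-flat shape groups plus a group of flat (pure-noise) curves
peaked = [np.sin(np.pi * t) + 0.05 * rng.standard_normal(50) for _ in range(4)]
dipped = [-np.sin(np.pi * t) + 0.05 * rng.standard_normal(50) for _ in range(4)]
flat = [0.05 * rng.standard_normal(50) for _ in range(4)]
curves = np.array([smooth(c) for c in peaked + dipped + flat])

keep = screen_flat(curves)          # the flat curves are screened out
labels = kmeans(curves[keep], k=2)  # cluster the remaining curves by shape
```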


Ophthalmology | 2013

Evaluation of Telemedicine for Screening of Diabetic Retinopathy in the Veterans Health Administration

Eser Kirkizlar; Nicoleta Serban; Jennifer A. Sisson; Julie L. Swann; Claire S. Barnes; Michael D. Williams

OBJECTIVE: To explore the cost-effectiveness of telemedicine for the screening of diabetic retinopathy (DR) and identify changes within the demographics of a patient population after telemedicine implementation.

DESIGN: A retrospective medical chart review (cohort study) was conducted.

PARTICIPANTS: A total of 900 type 1 and type 2 diabetic patients enrolled in a medical system with a telemedicine screening program for DR.

METHODS: The cost-effectiveness of the DR telemedicine program was determined by using a finite-horizon, discrete time, discounted Markov decision process model populated by parameters and testing frequency obtained from patient records. The model estimated the progression of DR and determined average quality-adjusted life years (QALYs) saved and average additional cost incurred by the telemedicine screening program.

MAIN OUTCOME MEASURES: Diabetic retinopathy, macular edema, blindness, and associated QALYs.

RESULTS: The results indicate that telemedicine screening is cost-effective for DR under most conditions. On average, it is cost-effective for patient populations of >3500, patients aged <80 years, and all racial groups. Observable trends were identified in the screening population since the implementation of telemedicine screening: the number of known DR cases has increased, the overall age of patients receiving screenings has decreased, the percentage of nonwhites receiving screenings has increased, the average number of miles traveled by a patient to receive a screening has decreased, and teleretinal screening participation is increasing.

CONCLUSIONS: The current teleretinal screening program is cost-effective and increases population reach. Future screening policies should consider the age of patients receiving screenings and the system's patient pool size, because our results indicate it is not cost-effective to screen patients aged older than 80 years or in populations with <3500 patients.
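The finite-horizon, discrete-time, discounted Markov decision process behind such a cost-effectiveness analysis can be sketched with backward induction. The states, transition probabilities, QALY weights, and screening cost below are illustrative placeholders, not the paper's estimated parameters.

```python
import numpy as np

# States: 0 = no DR, 1 = DR, 2 = blind (absorbing).
# Actions: 0 = no screening, 1 = telemedicine screening.
# All numbers are invented for illustration only.
P = np.array([
    [[0.90, 0.10, 0.00],    # no screening: faster progression
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]],
    [[0.95, 0.05, 0.00],    # screening: slower progression
     [0.00, 0.95, 0.05],
     [0.00, 0.00, 1.00]],
])
q = np.array([1.0, 0.8, 0.3])   # per-period QALY weight by state
cost = np.array([0.0, 0.002])   # screening cost in QALY-equivalents
gamma = 0.97                    # annual discount factor
T = 20                          # finite horizon (years)

V = np.zeros(3)                 # terminal value
for _ in range(T):              # backward induction over the horizon
    Q = q[None, :] - cost[:, None] + gamma * P @ V   # action values, (2, 3)
    V = Q.max(axis=0)

policy = Q.argmax(axis=0)       # best action in each state at time 0
```

With these toy numbers, screening is worthwhile in the non-absorbing states and pointless once blindness has occurred, mirroring the qualitative structure of the analysis.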


The Annals of Applied Statistics | 2011

Degradation modeling applied to residual lifetime prediction using functional data analysis

Rensheng R. Zhou; Nicoleta Serban; Nagi Gebraeel

Sensor-based degradation signals measure the accumulation of damage of an engineering system using sensor technology. Degradation signals can be used to estimate, for example, the distribution of the remaining life of partially degraded systems and/or their components. In this paper we present a nonparametric degradation modeling framework for making inference on the evolution of degradation signals that are observed sparsely or over short intervals of time. Furthermore, an empirical Bayes approach is used to update the stochastic parameters of the degradation model in real time, using training degradation signals, for online monitoring of components operating in the field. The primary application of this Bayesian framework is updating the residual lifetime, up to a degradation threshold, of partially degraded components. We validate our degradation modeling approach using a real-world crack growth data set as well as a case study of simulated degradation signals.
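The empirical-Bayes updating idea can be illustrated with a deliberately simplified linear degradation model (the paper's framework is nonparametric): the prior on a component's degradation rate comes from training signals, is updated in closed form from a sparse in-situ signal, and yields a residual-life estimate. All numbers are invented.

```python
import numpy as np

# Simplified stand-in for the degradation model: signal(t) = b * t + noise,
# with a component-specific rate b and known observation noise.
threshold = 10.0                 # failure threshold on the signal
sigma = 0.2                      # observation noise standard deviation

# Rates fitted to historical units give the empirical-Bayes prior on b.
train_rates = np.array([0.9, 1.1, 1.0, 1.2, 0.8])
mu0, s0 = train_rates.mean(), train_rates.std(ddof=1)

# Sparse in-situ observations of a partially degraded unit.
t_obs = np.array([1.0, 3.0, 5.0])
y_obs = 1.05 * t_obs + np.array([0.1, -0.05, 0.02])

# Conjugate normal update of the rate b given the partial signal.
prec = 1.0 / s0**2 + (t_obs**2).sum() / sigma**2
mu_post = (mu0 / s0**2 + (t_obs * y_obs).sum() / sigma**2) / prec

# Mean residual life: time left until the mean path hits the threshold.
t_now = t_obs[-1]
rul = threshold / mu_post - t_now
```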


Biometrics | 2013

Multilevel Cross-Dependent Binary Longitudinal Data

Nicoleta Serban; Ana-Maria Staicu; Raymond J. Carroll

We provide insights into new methodology for the analysis of multilevel binary data observed longitudinally, when the repeated longitudinal measurements are correlated. The proposed model is logistic functional regression conditioned on three latent processes describing the within- and between-variability, and describing the cross-dependence of the repeated longitudinal measurements. We estimate the model components without employing mixed-effects modeling but assuming an approximation to the logistic link function. The primary objectives of this article are to highlight the challenges in the estimation of the model components, to compare two approximations to the logistic regression function, linear and exponential, and to discuss their advantages and limitations. The linear approximation is computationally efficient whereas the exponential approximation applies for rare events functional data. Our methods are inspired by and applied to a scientific experiment on spectral backscatter from long range infrared light detection and ranging (LIDAR) data. The models are general and relevant to many new binary functional data sets, with or without dependence between repeated functional measurements.
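The scalar intuition behind the two approximations compared in the article can be seen directly: a linear (Taylor) approximation of the logistic function is accurate near p = 0.5, while an exponential approximation is accurate for rare events, where e^x is small. The functional-regression machinery itself is not reproduced in this sketch.

```python
import numpy as np

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def linear_approx(x):
    # first-order Taylor expansion of the logistic function at x = 0
    return 0.5 + x / 4.0

def exp_approx(x):
    # rare-events regime: 1 / (1 + e^-x) ~ e^x when e^x is small
    return np.exp(x)

x_mid, x_rare = 0.2, -4.0  # near p = 0.5 vs. a rare-event probability
err_lin_mid = abs(linear_approx(x_mid) - logistic(x_mid))
err_exp_mid = abs(exp_approx(x_mid) - logistic(x_mid))
err_lin_rare = abs(linear_approx(x_rare) - logistic(x_rare))
err_exp_rare = abs(exp_approx(x_rare) - logistic(x_rare))
```

Each approximation dominates in its own regime, which is the trade-off the article discusses: computational efficiency for the linear form versus validity for rare-events functional data for the exponential form.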


Journal of the American Medical Informatics Association | 2015

Understanding variations in pediatric asthma care processes in the emergency department using visual analytics

Rahul C. Basole; Mark L. Braunstein; Vikas Kumar; Hyunwoo Park; Minsuk Kahng; Duen Horng Chau; Acar Tamersoy; Daniel A. Hirsh; Nicoleta Serban; James Bost; Burton Lesnick; Beth L. Schissel; Michael Thompson

Health care delivery processes consist of complex activity sequences spanning organizational, spatial, and temporal boundaries. Care is human-directed, so these processes can have wide variations in cost, quality, and outcome, making systemic care process analysis, conformance testing, and improvement challenging. We designed and developed an interactive visual analytic process exploration and discovery tool and used it to explore clinical data from 5784 pediatric asthma emergency department patients.


Technometrics | 2012

Clustering Random Curves Under Spatial Interdependence With Application to Service Accessibility

Huijing Jiang; Nicoleta Serban

Service accessibility is defined as the access of a community to the nearby site locations in a service network consisting of multiple geographically distributed service sites. Leveraging new statistical methods, this article estimates and classifies service accessibility patterns varying over a large geographic area (Georgia) and over a period of 16 years. The focus of this study is on financial services but it generally applies to any other service operation. To this end, we introduce a model-based method for clustering random time-varying functions that are spatially interdependent. The underlying clustering model is nonparametric with spatially correlated errors. We also assume that the clustering membership is a realization from a Markov random field. Under these model assumptions, we borrow information across functions corresponding to nearby spatial locations resulting in enhanced estimation accuracy of the cluster effects and of the cluster membership as shown in a simulation study. Supplementary materials including the estimation algorithm, additional maps of the data, and the C++ computer programs for analyzing the data in our case study are available online.
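One simple way to realize "borrowing information across nearby locations" under a Markov random field membership prior is iterated conditional modes (ICM) with a Potts-style penalty. The sketch below, with invented 1-D sites and cluster shapes, illustrates the general idea rather than the article's algorithm (the authors' C++ programs are in the online supplementary materials).

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 30)
# two known cluster mean curves (in practice these are also estimated)
shapes = np.array([np.sin(np.pi * t), np.cos(np.pi * t)])

# 12 sites on a line; true memberships are spatially contiguous
true = np.array([0] * 6 + [1] * 6)
curves = shapes[true] + 0.4 * rng.standard_normal((12, 30))

beta = 2.0  # strength of the Markov random field (Potts) prior

# per-site goodness of fit to each cluster shape, (12, 2)
sse = ((curves[:, None, :] - shapes[None]) ** 2).sum(-1)

# ICM: start from likelihood-only labels, then iterate, penalizing
# disagreement with the spatial neighbors at each site
labels = sse.argmin(1)
for _ in range(5):
    for i in range(12):
        nb = [j for j in (i - 1, i + 1) if 0 <= j < 12]
        energy = [sse[i, k] + beta * sum(labels[j] != k for j in nb)
                  for k in (0, 1)]
        labels[i] = int(np.argmin(energy))
```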


The Annals of Applied Statistics | 2011

A space–time varying coefficient model: The equity of service accessibility

Nicoleta Serban

Research examining the equity of service accessibility has emerged as economic and social equity advocates recognized that where people live influences their opportunities for economic development, access to quality health care, and political participation. In this paper, service accessibility equity is concerned with where and when services have been and are accessed by different groups of people, identified by location or by underlying socioeconomic variables. Using new statistical methods for modeling spatial-temporal data, this paper estimates patterns of association between demographics and financial service accessibility, varying over a large geographic area (Georgia) and over a period of 13 years. The underlying model is a space–time varying coefficient model including both separable space and time varying coefficients and space–time interaction terms. The model is extended to a multilevel response where the varying coefficients account for both the within- and between-variability. We introduce an inference procedure for assessing the shape of the varying regression coefficients using confidence bands.
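In its simplest form, a varying coefficient model can be fit by basis expansion and ordinary least squares. The toy below, with a single covariate and a coefficient that is linear in time, only illustrates the core idea, not the paper's space–time multilevel model; all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
t = rng.uniform(0, 1, n)            # observation time
x = rng.standard_normal(n)          # covariate (e.g. a demographic variable)
beta_true = 1.0 + 2.0 * t           # coefficient that varies with time
y = beta_true * x + 0.1 * rng.standard_normal(n)

# Basis expansion: beta(t) ~ c0 + c1*t turns the model into
# ordinary regression on the expanded design [x, t*x].
X = np.column_stack([x, t * x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
beta_hat = lambda s: coef[0] + coef[1] * s   # fitted varying coefficient
```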


Computational Statistics & Data Analysis | 2007

Discovery, visualization and performance analysis of enterprise workflow

Ping Zhang; Nicoleta Serban

This work was motivated by a recent experience where we needed to develop enterprise operational reports when the underlying business process is not entirely known, a common situation for large companies with sophisticated IT systems. We learned that instead of relying on human knowledge or business documentation, it is much more reliable to learn from the flow structure of event sequences recorded for work items. An example of a work item is a product alarm detected and reported to a technical center through a remote monitoring system; the corresponding event sequence of a work item is an alarm history, i.e., the alarm handling process. We call the flow of event sequences recorded for work items a workflow. In this paper, we develop an algorithm to discover and visualize workflows for data from a remote technical support center, and argue that workflow discovery is a prerequisite for rigorous performance analysis. We also carry out a detailed performance analysis based on the discovered workflow. Among other things, we find that service time (e.g., the time necessary for handling a product alarm) fits the profile of a log-mixture distribution. It takes at least two parameters to describe such a distribution, which leads to the proposed method of using two metrics for service time reporting.
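Workflow discovery from event sequences can be reduced, in its most basic form, to building a directed transition graph with edge counts, whose heaviest edges reveal the dominant process path. The alarm-handling event names below are invented, not the support-center data.

```python
from collections import Counter

# Illustrative alarm-handling histories, one event sequence per work item.
histories = [
    ["alarm", "triage", "dispatch", "resolve"],
    ["alarm", "triage", "resolve"],
    ["alarm", "triage", "dispatch", "escalate", "resolve"],
    ["alarm", "triage", "dispatch", "resolve"],
]

# Discover the workflow as a directed transition graph with edge counts.
edges = Counter()
for h in histories:
    for a, b in zip(h, h[1:]):
        edges[(a, b)] += 1

# The most frequent transitions trace the dominant workflow path.
top = edges.most_common(2)
```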


IIE Transactions | 2012

Degradation modeling and monitoring of truncated degradation signals

Rensheng Zhou; Nagi Gebraeel; Nicoleta Serban

Advancements in condition monitoring techniques have facilitated the utilization of sensor technology for predicting failures of engineering systems. Within this context, failure is defined as the point where a sensor-based degradation signal reaches a pre-specified failure threshold. Parametric degradation models rely on complete signals to estimate the parametric functional form and do not perform well with sparse historical data. On the other hand, non-parametric models that address the challenges of data sparsity usually assume that signal observations can be made beyond the failure threshold. Unfortunately, in most applications, degradation signals can only be observed up to the failure threshold, resulting in what this article refers to as truncated degradation signals. This article combines a non-parametric degradation modeling framework with a signal transformation procedure, allowing different types of truncated degradation signals to be characterized. This article considers (i) complete signals that result from constant monitoring of a system up to its failure; (ii) sparse signals resulting from sparse observations; and (iii) fragmented signals that result from dense observations over disjoint time intervals. The goal is to estimate and update the residual life distributions of partially degraded systems using in situ signal observations. We show that the proposed model outperforms existing models for all three signal types.


Technometrics | 2014

A Functional Time Warping Approach to Modeling and Monitoring Truncated Degradation Signals

Rensheng R. Zhou; Nicoleta Serban; Nagi Gebraeel; Hans-Georg Müller

Degradation signals are sensor-based signals that are correlated with degradation processes of engineering components. In this article, we present a flexible modeling framework for characterizing degradation signals that can only be observed up to a prespecified failure threshold. The underlying assumption of this framework is that the engineering components degrade according to a similar trend, referred to as the common shape function, but at different degradation rates. Under this assumption, the degradation signals of different components are synchronized using a random time warping process that transforms the common trend function into degradation processes that progress at varying rates. Our primary objective is to obtain real-time predictions for the residual lifetime of components deployed in the field. In the initial step, the historical degradation signals are used to recover the distribution of the degradation processes under the assumptions of the proposed time warping model. Next, the distribution of the degradation process is updated using the signal(s) of partially degraded component(s). The updated model is then used to predict the residual lifetime distributions of these components. We test the performance of our methodology using vibration-based degradation signals from a rotating machinery experiment and simulated degradation signals. Additional information and codes are available as supplementary material online.
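The rate-based time warping assumption, that each unit traverses a common shape function at its own rate, can be illustrated with a toy rate-recovery step. The shape function g, the rate grid, and the failure threshold below are all assumptions for illustration, not the paper's estimated model.

```python
import numpy as np

# Toy warping model: signal(t) = g(r * t), where g is the common
# degradation trend and r is the unit-specific warping rate.
def g(t):
    return t ** 1.5            # assumed common shape function

threshold = 8.0                # failure threshold on the signal
t_obs = np.linspace(0.5, 2.0, 8)
true_r = 1.3                   # this unit degrades 1.3x faster than the trend
y_obs = g(true_r * t_obs)      # noise-free observations for simplicity

# Recover the warping rate by least squares over a grid of candidate rates.
grid = np.linspace(0.5, 2.0, 151)
errs = [((g(r * t_obs) - y_obs) ** 2).sum() for r in grid]
r_hat = grid[int(np.argmin(errs))]

# Predicted failure time: when g(r_hat * t) reaches the threshold,
# i.e. r_hat * t = g^{-1}(threshold) = threshold ** (1 / 1.5).
t_fail = (threshold ** (1 / 1.5)) / r_hat
```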

Collaboration


Dive into Nicoleta Serban's collaborations.

Top Co-Authors

Julie L. Swann (Georgia Institute of Technology)
William B. Rouse (Stevens Institute of Technology)
Nagi Gebraeel (Georgia Institute of Technology)
Paul M. Griffin (Georgia Institute of Technology)
Yuchen Zheng (Georgia Institute of Technology)
Clark Glymour (Carnegie Mellon University)
Larry Wasserman (Carnegie Mellon University)