Jye-Chyi Lu
Georgia Institute of Technology
Publications
Featured research published by Jye-Chyi Lu.
Technometrics | 1997
Jye-Chyi Lu; Jinho Park; Qing Yang
In the study of semiconductor degradation, records of transconductance loss or threshold voltage shift over time are useful in constructing the cumulative distribution function (cdf) of the time until the degradation reaches a specified level. In this article, we propose a model with random regression coefficients and a standard-deviation function for analyzing linear degradation data. Both analytical and empirical motivations of the model are provided. We estimate the model parameters, the cdf, and its quantiles by the maximum likelihood (ML) method and construct confidence intervals from the bootstrap, from the asymptotic normal approximation, and from inverting likelihood ratio tests. Simulations are conducted to examine the properties of the ML estimates and the confidence intervals. Analysis of an engineering dataset illustrates the proposed procedures.
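A minimal sketch (not the authors' code) of the random-coefficient linear degradation idea: simulate unit-specific intercepts and slopes, take the time at which each path first reaches a specified degradation level, and estimate the failure-time cdf and a quantile by Monte Carlo. All distributions and parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical random-coefficient linear degradation model:
# D_i(t) = a_i + b_i * t, with unit-specific intercepts and slopes.
n_units = 10_000
a = rng.normal(loc=0.0, scale=0.05, size=n_units)                # random intercepts
b = rng.lognormal(mean=np.log(0.02), sigma=0.3, size=n_units)    # random positive slopes

# Time until degradation first reaches a specified level D_f.
D_f = 0.5
T = (D_f - a) / b

# Monte Carlo estimate of the cdf F(t) = P(T <= t) and a quantile.
t_grid = np.linspace(0, 60, 7)
cdf = [(T <= t).mean() for t in t_grid]
print(dict(zip(t_grid.round(1), np.round(cdf, 3))))
print("estimated 0.10 quantile of failure time:", np.quantile(T, 0.10).round(2))
```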
Scientometrics | 2002
Alan L. Porter; Alisa Kongthon; Jye-Chyi Lu
We propose enhancing the traditional literature review through “research profiling”. This broad scan of contextual literature can extend the span of science by better linking efforts across research domains. Topical relationships, research trends, and complementary capabilities can be discovered, thereby facilitating research projects. Modern search engine and text mining tools enable research profiling by exploiting the wealth of accessible information in electronic abstract databases such as MEDLINE and Science Citation Index. We illustrate the potential by showing sixteen ways that “research profiling” can augment a traditional literature review on the topic of data mining.
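As a toy illustration of the kind of term-level profiling such tools make possible (this is not the authors' procedure; the records and topical terms below are invented), one can count how often selected terms appear in abstract records by year:

```python
from collections import Counter
import re

# Toy stand-in for records pulled from an abstract database; all entries are made up.
records = [
    {"year": 1999, "abstract": "Data mining methods for association rules in retail data."},
    {"year": 2000, "abstract": "Text mining and clustering of biomedical abstracts."},
    {"year": 2001, "abstract": "Scalable data mining with decision trees and neural networks."},
]

terms = ["data mining", "text mining", "clustering", "neural networks"]

# Count how often each topical term appears per year -- a crude "research profile".
profile = {}
for rec in records:
    text = rec["abstract"].lower()
    counts = profile.setdefault(rec["year"], Counter())
    for term in terms:
        counts[term] += len(re.findall(re.escape(term), text))

for year in sorted(profile):
    print(year, dict(profile[year]))
```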
International Journal of Production Research | 2006
Myong K. Jeong; Jye-Chyi Lu; N. Wang
Functional data characterize the quality or reliability performance of many manufacturing processes. As can be seen in the literature, such data are informative in process monitoring and control for nanomachining, for ultra-thin semiconductor fabrication, and for antenna, steel-stamping, or chemical manufacturing processes. Many functional data in manufacturing applications show complicated transient patterns such as peaks representing important process characteristics. Wavelet transforms are popular in the computing and engineering fields for handling these types of complicated functional data. This article develops a wavelet-based statistical process control (SPC) procedure for detecting ‘out-of-control’ events that signal process abnormalities. Simulation-based evaluations of average run length indicate that our new procedure performs better than extensions from well-known methods in the literature. More importantly, unlike recent SPC research on linear profile data for monitoring global changes of data patterns, our methods focus on local changes in data segments. In contrast to most of the SPC procedures developed for detecting a known type of process change, our idea of updating the selected parameters adaptively can handle many types of process changes whether known or unknown. Finally, due to the data-reduction efficiency of wavelet thresholding, our procedure can deal effectively with large data sets.
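A simplified sketch of the general idea, not the article's adaptive procedure: transform the deviation of an observed profile from a reference curve with a hand-rolled one-level Haar transform, keep only coefficients above a data-driven threshold, and use their energy as a monitoring statistic that reacts to a local change. The curves, noise level, and threshold multiplier are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def haar_level1(x):
    """One-level Haar transform: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def monitor_stat(curve, reference, k=3.0):
    """Energy of thresholded Haar coefficients of the deviation from a reference."""
    approx, detail = haar_level1(curve - reference)
    sigma = np.median(np.abs(detail)) / 0.6745      # robust noise-scale estimate
    coefs = np.concatenate([approx, detail])
    kept = coefs[np.abs(coefs) > k * sigma]         # wavelet thresholding
    return float(np.sum(kept ** 2))

# Hypothetical in-control functional profile (all values made up).
t = np.linspace(0, 1, 256)
baseline = np.sin(2 * np.pi * t)

in_control = baseline + rng.normal(0, 0.05, t.size)
faulty = baseline + rng.normal(0, 0.05, t.size)
faulty[120:130] += 0.6                              # local change in one data segment

print("in-control statistic:", round(monitor_stat(in_control, baseline), 3))
print("faulty statistic:    ", round(monitor_stat(faulty, baseline), 3))
```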
European Journal of Operational Research | 2012
Jye-Chyi Lu; Faiz A. Al-Khayyal; Yu-Chung Tsao
Managing shelf space is critical for retailers to attract customers and optimize profits. This article develops a shelf-space allocation optimization model that explicitly incorporates essential in-store costs and considers space- and cross-elasticities. A piecewise linearization technique is used to approximate the complicated nonlinear space-allocation model. The approximation reformulates the non-convex optimization problem into a linear mixed integer programming (MIP) problem. The MIP solution not only generates near-optimal solutions for large-scale optimization problems, but also provides an error bound to evaluate the solution quality. Consequently, the proposed approach can solve single-category shelf-space management problems with as many products as are typically encountered in practice and with more complicated cost and profit structures than is currently possible with existing methods. Numerical experiments show the competitive accuracy of the proposed method compared with the mixed integer nonlinear programming shelf-space model. Several extensions of the main model are discussed to illustrate the flexibility of the proposed methodology.
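A hedged illustration of the piecewise-linearization step (the profit function and coefficients below are hypothetical, and the full MIP is omitted): a nonlinear space-elasticity profit term can be replaced by linear interpolation between breakpoints, and the largest gap between the curve and its interpolant gives the kind of error bound mentioned above.

```python
import numpy as np

# Hypothetical space-elasticity profit term f(x) = a * x**b (values made up).
a, b = 5.0, 0.7
f = lambda x: a * x ** b

# Piecewise-linear approximation over breakpoints on the feasible space range.
breaks = np.linspace(0.0, 10.0, 6)          # shelf facings allocated to one product
vals = f(breaks)

def f_pwl(x):
    """Interpolate between breakpoints (the convex-combination 'lambda' form used in MIPs)."""
    return np.interp(x, breaks, vals)

# Error bound: largest gap between the true curve and its linear interpolant,
# which bounds how far the linearized optimum can be from the true optimum.
grid = np.linspace(breaks[0], breaks[-1], 2001)
gap = np.max(np.abs(f(grid) - f_pwl(grid)))
print("max approximation error over the range:", round(gap, 4))
```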
IEEE Transactions on Components, Packaging, and Manufacturing Technology: Part C | 1997
Martha M. Gardner; Jye-Chyi Lu; J. J. Wortman; Brian E. Hornung; Holger H. Heinisch; Eric A. Rying; Suraj Rao; Joseph C. Davis; Purnendu K. Mozumder
This paper describes a new methodology for equipment fault detection. The key features of this methodology are that it allows for the incorporation of spatial information and that it can be used to detect and diagnose equipment faults simultaneously. The methodology consists of constructing a virtual wafer surface from spatial data and using physically based spatial signature metrics to compare the virtual wafer surface to an established baseline process surface in order to detect equipment faults. Statistical distributional studies of the spatial signature metrics provide the justification for determining the significance of a spatial signature. Data collected from a rapid thermal chemical vapor deposition (RTCVD) process and from a plasma-enhanced chemical vapor deposition (PECVD) process are used to illustrate the procedures. The method detected equipment faults for all 11 wafers subjected to induced equipment faults in the RTCVD process, and diagnosed the type of equipment fault for 10 of these wafers. It also detected 42 of 44 induced equipment faults in the PECVD process.
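A rough sketch of the flavor of this approach (not the paper's actual metrics or data): fit a low-order trend surface to site measurements as a "virtual wafer surface" and compare it to a baseline surface through simple spatial signature summaries. The coordinates, thickness values, and induced fault below are made up.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical thickness measurements at sites on a wafer (values made up).
sites = rng.uniform(-1.0, 1.0, size=(49, 2))
x, y = sites[:, 0], sites[:, 1]
design = np.column_stack([np.ones_like(x), x, y, x**2, y**2, x*y])

def fit_surface(z):
    """Least-squares quadratic 'virtual wafer surface' from site measurements."""
    coef, *_ = np.linalg.lstsq(design, z, rcond=None)
    return coef

def signature_metrics(coef, coef_base, n=41):
    """Simple spatial signature metrics comparing a wafer surface to the baseline."""
    g = np.linspace(-1, 1, n)
    gx, gy = [a.ravel() for a in np.meshgrid(g, g)]
    grid = np.column_stack([np.ones_like(gx), gx, gy, gx**2, gy**2, gx*gy])
    diff = grid @ (coef - coef_base)
    return {"mean_shift": round(diff.mean(), 3),
            "rms_deviation": round(np.sqrt((diff**2).mean()), 3)}

baseline = fit_surface(100 + 0.5 * x + rng.normal(0, 0.2, x.size))
good     = fit_surface(100 + 0.5 * x + rng.normal(0, 0.2, x.size))
faulty   = fit_surface(100 + 0.5 * x + 2.0 * (x**2 + y**2) + rng.normal(0, 0.2, x.size))

print("good wafer:  ", signature_metrics(good, baseline))
print("faulty wafer:", signature_metrics(faulty, baseline))
```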
Technometrics | 2006
Myong-Kee Jeong; Jye-Chyi Lu; Xiaoming Huo; Brani Vidakovic; Di Chen
This article presents new data reduction methods based on the discrete wavelet transform to handle potentially large and complicated nonstationary data curves. The methods minimize objective functions that balance the trade-off between data reduction and modeling accuracy. Theoretical investigations establish the optimality of the methods and the large-sample distribution of a closed-form estimate of the thresholding parameter. An upper bound on the error in signal approximation (or estimation) is derived. In evaluation studies with popular test curves and real-life datasets, the proposed methods are competitive with existing engineering data-compression and statistical denoising methods for achieving the data-reduction goals. Further experimentation with a tree-based classification procedure for identifying process fault classes illustrates the potential of the data reduction tools. Extension of the engineering scalogram to the reduced-size semiconductor fabrication data leads to a visualization tool for monitoring and understanding process problems.
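A minimal, self-contained sketch of the reduction-versus-accuracy trade-off using a hand-rolled Haar transform; the test curve, noise level, and thresholds are hypothetical, and this is not the article's optimized thresholding rule.

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_dwt(x):
    """Full Haar decomposition of a length-2^J signal into one coefficient vector."""
    x = np.asarray(x, dtype=float)
    pieces = []
    while x.size > 1:
        pieces.append((x[0::2] - x[1::2]) / np.sqrt(2.0))   # detail coefficients
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)               # approximation
    pieces.append(x)
    return np.concatenate(pieces[::-1])

def haar_idwt(c):
    """Inverse of haar_dwt."""
    x, pos = c[:1], 1
    while pos < c.size:
        d = c[pos:pos + x.size]
        out = np.empty(2 * x.size)
        out[0::2], out[1::2] = (x + d) / np.sqrt(2.0), (x - d) / np.sqrt(2.0)
        x, pos = out, pos + d.size
    return x

# Hypothetical nonstationary test curve: a sharp bump plus a smooth trend and noise.
n = 256
t = np.linspace(0, 1, n)
data = (np.exp(-((t - 0.3) / 0.03) ** 2) + 0.4 * np.sin(6 * np.pi * t)
        + rng.normal(0, 0.05, n))

c = haar_dwt(data)
for thr in (0.05, 0.15, 0.40):
    kept = np.where(np.abs(c) > thr, c, 0.0)       # larger threshold -> more reduction
    recon = haar_idwt(kept)
    ratio = np.count_nonzero(kept) / n             # fraction of coefficients retained
    err = np.linalg.norm(recon - data) / np.linalg.norm(data)
    print(f"threshold={thr:4.2f}  kept={ratio:5.3f}  relative error={err:5.3f}")
```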
IEEE Transactions on Reliability | 1989
Jye-Chyi Lu
The physical motivation for the bivariate extensions of the exponential distribution due to Freund (1961) and Marshall and Olkin (1967) is common in engineering applications. The author extends their models to the case where the failure rate of one component changes upon the failure of the other and where Poisson fatal shocks cause simultaneous failures of both components, in order to derive bivariate extensions of the Weibull distribution. Some special cases, such as bivariate extensions of the linear hazard rate and minimum-type distributions, are discussed.
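A small Monte Carlo sketch of the Marshall-Olkin "fatal shock" part of this construction and its Weibull extension via a power transform; it does not cover the Freund-type change in failure rate, and all rate and shape parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical shock rates and common Weibull shape parameter.
lam1, lam2, lam12, alpha = 1.0, 0.8, 0.5, 1.5
n = 500_000

z1 = rng.exponential(1 / lam1, n)     # shock killing component 1 only
z2 = rng.exponential(1 / lam2, n)     # shock killing component 2 only
z12 = rng.exponential(1 / lam12, n)   # common shock killing both simultaneously

# Exponential Marshall-Olkin lifetimes, then a power transform to Weibull margins.
x = np.minimum(z1, z12) ** (1 / alpha)
y = np.minimum(z2, z12) ** (1 / alpha)

s, t = 0.7, 0.9
empirical = np.mean((x > s) & (y > t))
closed_form = np.exp(-lam1 * s**alpha - lam2 * t**alpha - lam12 * max(s, t)**alpha)
print("empirical joint survival:  ", round(empirical, 4))
print("closed-form joint survival:", round(closed_form, 4))
```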
IEEE Transactions on Neural Networks | 2002
Eric A. Rying; Griff L. Bilbro; Jye-Chyi Lu
A novel objective function is presented that incorporates both local and global errors as well as model parsimony in the construction of wavelet neural networks. Two methods are presented to assist in the minimization of this objective function, especially the local error term. First, during network initialization, a locally adaptive grid is used to include candidate wavelet basis functions whose local support addresses the local error over the local feature set. This set can be either user-defined or determined using information derived from the wavelet transform modulus maxima representation. Next, during network construction, a new selection procedure based on a subspace projection operator is presented to help focus the selection of wavelet basis functions so as to reduce the local error. Simulation results demonstrate the effectiveness of these methodologies in minimizing local and global error while maintaining model parsimony and incurring only a minimal increase in computational complexity.
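A toy sketch of the spirit of this construction (not the paper's algorithm): greedily add candidate wavelet basis functions while minimizing an objective that combines global error, a weighted local error over a user-defined feature region, and a parsimony penalty. The target function, candidate dictionary, and weights are all made up.

```python
import numpy as np

def mexican_hat(t, center, scale):
    """Candidate wavelet basis function (Mexican hat), translated and dilated."""
    u = (t - center) / scale
    return (1 - u ** 2) * np.exp(-u ** 2 / 2)

# Hypothetical target with a sharp local feature near t = 0.3 (values made up).
t = np.linspace(0, 1, 400)
target = np.sin(2 * np.pi * t) + 1.5 * np.exp(-((t - 0.3) / 0.02) ** 2)
local = (t > 0.25) & (t < 0.35)            # user-defined local feature region

# Candidate dictionary over a grid of centers and scales (a locally adaptive grid
# would place extra candidates near the feature; a uniform grid is used for brevity).
candidates = [mexican_hat(t, c, s) for s in (0.02, 0.05, 0.15)
              for c in np.linspace(0, 1, 41)]

def objective(residual, n_terms, w_local=5.0, w_parsimony=1e-3):
    """Global error + weighted local error + parsimony penalty."""
    return (np.mean(residual ** 2) + w_local * np.mean(residual[local] ** 2)
            + w_parsimony * n_terms)

# Greedy forward selection: add whichever candidate most reduces the objective.
selected, design = [], []
best_obj = objective(target, 0)
while True:
    scores = []
    for j, g in enumerate(candidates):
        if j in selected:
            scores.append(np.inf)
            continue
        X = np.column_stack(design + [g])
        coef, *_ = np.linalg.lstsq(X, target, rcond=None)
        scores.append(objective(target - X @ coef, len(selected) + 1))
    j_best = int(np.argmin(scores))
    if scores[j_best] >= best_obj:
        break
    selected.append(j_best)
    design.append(candidates[j_best])
    best_obj = scores[j_best]

print("basis functions selected:", len(selected))
print("final objective value:", round(best_obj, 5))
```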
European Journal of Operational Research | 2012
Yu-Chung Tsao; Divya Mangotra; Jye-Chyi Lu; Ming Dong
In today’s retail business, many companies have a complex distribution network with several national and regional distribution centers. This article studies an integrated facility location and inventory allocation problem for designing a distribution network with multiple distribution centers and retailers. The key decisions are where to locate the regional distribution centers (RDCs), how to assign retail stores to RDCs, and what the inventory policy at the different locations should be so that the total network cost is minimized. Due to the complexity of the problem, a continuous approximation (CA) model is used to represent the network. Nonlinear programming techniques are developed to solve the optimization problems. The main contribution of this work lies in developing a new CA modeling technique for cases where the discrete data cannot be modeled by a continuous function, and applying this technique to solve an integrated facility location-allocation and inventory-management problem. Our methodology is illustrated with the network of a leading US retailer. Numerical analysis suggests that the total cost is significantly lower in the case of the integrated model than with the non-integrated model, where the location-allocation and inventory-management problems are considered separately. This paper also studies the effects of changing parameter values on the optimal solutions and points out some managerial implications.
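A back-of-the-envelope sketch of the continuous-approximation idea (not the paper's model): treat store demand as a density, write an approximate cost per unit area as a function of the service-area size of one RDC, and pick the area that minimizes it. Every coefficient below is hypothetical.

```python
import numpy as np

# Hypothetical cost coefficients for the continuous-approximation trade-off.
density = 4.0        # store demand per unit area
fixed_cost = 500.0   # cost of operating one RDC
transport = 12.0     # outbound transport cost per unit demand per unit distance
holding = 2.0        # inventory holding cost factor at an RDC

def cost_per_area(A):
    """Approximate cost rate per unit area for an RDC service area of size A."""
    facility = fixed_cost / A                            # fixed cost spread over the area
    outbound = transport * density * 0.38 * np.sqrt(A)   # average travel distance ~ k*sqrt(A)
    inventory = holding * np.sqrt(density * A) / A       # EOQ-style inventory cost per area
    return facility + outbound + inventory

A_grid = np.linspace(0.5, 50, 2000)
costs = cost_per_area(A_grid)
A_opt = A_grid[np.argmin(costs)]
print("near-optimal service area per RDC:", round(A_opt, 2))
print("implied number of RDCs for a 1000-unit region:", int(np.ceil(1000 / A_opt)))
```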
IEEE Transactions on Reliability | 2006
Ni Wang; Jye-Chyi Lu; Paul H. Kvam
This article proposes methods for modeling service reliability in a supply chain. The logistics system in a supply chain typically consists of thousands of retail stores along with multiple distribution centers (DCs). Products are transported between DCs and stores through multiple routes. The service reliability depends on DC location layouts, distances from DCs to stores, time requirements for product replenishment at stores, the DCs' capability for supporting store demands, and the connectivity of transportation routes. Contingent events such as labor disputes, bad weather, road conditions, traffic situations, and even terrorist threats can have great impacts on a system's reliability. Given the large number of store locations and the multiple combinations of routing schemes, this article applies an approximation technique for developing first-cut reliability analysis models. The approximation relies on multi-level spatial models to characterize patterns of store locations and demands. These models support several types of reliability evaluation of the logistics system under different probability scenarios and contingency situations. Examples with data taken from a large-scale logistics system of an automobile company illustrate the importance of studying supply-chain system reliability.
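A simple Monte Carlo sketch in the same spirit (not the article's spatial models): scatter stores around a few DCs, disrupt DCs independently in each contingency scenario, and estimate the distribution of the fraction of demand that can still be served within a coverage radius. Locations, demands, radii, and disruption probabilities are all invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical network: DC locations and spatially scattered stores (made up).
dcs = np.array([[0.2, 0.2], [0.8, 0.3], [0.5, 0.8]])
stores = rng.uniform(0, 1, size=(500, 2))
demand = rng.gamma(shape=2.0, scale=50.0, size=stores.shape[0])

# Each store is served by any operating DC within a coverage radius.
dist = np.linalg.norm(stores[:, None, :] - dcs[None, :, :], axis=2)
radius = 0.6

def served_fraction(dc_up):
    """Fraction of total demand reachable from an operating DC within the radius."""
    reachable = (dist <= radius) & dc_up[None, :]
    return demand[reachable.any(axis=1)].sum() / demand.sum()

# Monte Carlo over contingency scenarios: each DC is disrupted independently.
p_disrupt = np.array([0.05, 0.10, 0.10])
reps = 20_000
fractions = np.empty(reps)
for r in range(reps):
    dc_up = rng.random(3) > p_disrupt
    fractions[r] = served_fraction(dc_up)

print("expected service level:", round(fractions.mean(), 4))
print("P(service level >= 95%):", round((fractions >= 0.95).mean(), 4))
```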