Zhenlin Yang
Singapore Management University
Publication
Featured research published by Zhenlin Yang.
Journal of Quality Technology | 2002
Zhenlin Yang; Min Xie; Vellaisamy Kuralmani; Kwok-Leung Tsui
The control chart based on the geometric distribution (geometric chart) has been shown to be competitive with p- or np-charts for monitoring the proportion nonconforming, especially for applications in high quality manufacturing environments. However, implementing a geometric chart is often based on the assumption that the in-control proportion nonconforming is known or accurately estimated. For a high quality process, an accurate parameter estimate may require a very large sample size that is seldom available. In this paper we investigate the sample size effect when the proportion nonconforming is estimated. An analytical approximation is derived to compute shift detection probabilities and run length distributions. It is found that the effect on the alarm probability can be significant even with sample sizes as large as 10,000. However, the average run length is only affected mildly unless the sample size is small and there is a large process improvement. In practice, the quantitative results of the paper can be used to determine the minimum number of items required for estimating the control limits of a geometric chart so that certain average run length requirements are met.
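The basic mechanics of a geometric chart with a known in-control proportion can be sketched in a few lines (an illustrative computation of probability limits and the alarm probability, not the paper's analytical approximation for the estimated-parameter case; the false-alarm rate `alpha` and the in-control `p` are assumed design choices):

```python
import math

def geometric_limits(p, alpha=0.0027):
    """Probability limits for a geometric (CCC) chart that monitors the
    count of items inspected until a nonconforming one is found.
    For X ~ Geometric(p) on 1, 2, ..., P(X > n) = (1 - p)**n."""
    q = 1.0 - p
    # Upper limit: smallest n with P(X > n) <= alpha/2.
    ucl = math.ceil(math.log(alpha / 2) / math.log(q))
    # Lower limit: largest n with P(X <= n) <= alpha/2.
    lcl = math.floor(math.log(1 - alpha / 2) / math.log(q))
    return lcl, ucl

def alarm_probability(p_true, lcl, ucl):
    """Probability that a count falls outside the limits when the true
    proportion nonconforming is p_true."""
    q = 1.0 - p_true
    return (1.0 - q ** lcl) + q ** ucl

lcl, ucl = geometric_limits(0.001)           # in-control p assumed known
print(lcl, ucl)
print(alarm_probability(0.001, lcl, ucl))    # false-alarm rate near alpha
```

Replacing the known `p` by an estimate from a finite sample shifts these limits, which is the sample-size effect the paper quantifies.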
Journal of Statistical Computation and Simulation | 2003
Zhenlin Yang; Min Xie
The maximum likelihood estimator of the Weibull shape parameter can be very biased. An estimator based on the modified profile likelihood is proposed and its properties are studied. It is shown that the new estimator is almost unbiased, with relative bias below 1% in most situations, and that it is much more efficient than the regular MLE. The smaller the sample or the heavier the censoring, the more efficient the new estimator is relative to the regular MLE.
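The small-sample bias of the regular shape MLE that motivates the paper can be checked by simulation (a minimal sketch demonstrating the bias only, not the modified-profile-likelihood estimator the paper proposes; sample size, shape, and replication counts are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def weibull_shape_mle(x, lo=0.01, hi=50.0, tol=1e-8):
    """Solve the Weibull shape MLE equation (scale profiled out)
       sum(x^c log x)/sum(x^c) - mean(log x) - 1/c = 0
    by bisection; the left side is increasing in c."""
    lx = np.log(x)
    def g(c):
        xc = x ** c
        return (xc * lx).sum() / xc.sum() - lx.mean() - 1.0 / c
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def relative_bias(n=10, true_shape=2.0, reps=1000):
    """Monte Carlo relative bias of the shape MLE for samples of size n.
    Weibull variates are generated by inverting the CDF."""
    est = [weibull_shape_mle((-np.log(rng.random(n))) ** (1.0 / true_shape))
           for _ in range(reps)]
    return np.mean(est) / true_shape - 1.0

print(f"relative bias of shape MLE, n=10: {relative_bias():+.1%}")
```

With n = 10 the regular MLE overestimates the shape noticeably, which is the distortion the modified profile likelihood is designed to remove.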
Computational Statistics & Data Analysis | 2006
Jun Yu; Zhenlin Yang; Xibin Zhang
A class of stochastic volatility (SV) models is proposed by applying the Box-Cox transformation to the volatility equation. This class of nonlinear SV (N-SV) models encompasses all standard SV models, including the well-known lognormal (LN) SV model. It allows all standard specifications to be compared and tested empirically in a very convenient way, and it provides a measure of the degree of departure from the classical models. A likelihood-based technique is developed for analyzing the model. Daily dollar/pound exchange rate data provide some evidence against the LN model and strong evidence against all the other classical specifications. An efficient algorithm is proposed to study the economic importance of the proposed model in pricing currency options.
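The Box-Cox transform at the heart of the N-SV class can be written in a few lines; the transformation parameter (here `delta`, a notational assumption) at zero recovers the log, i.e. the LN-SV special case. This sketch shows only the transform itself, not the likelihood-based estimation the paper develops:

```python
import numpy as np

def box_cox(v, delta):
    """Box-Cox transform of the volatility: (v**delta - 1)/delta,
    with delta = 0 defined as the limit log(v)."""
    v = np.asarray(v, dtype=float)
    return np.log(v) if delta == 0 else (v ** delta - 1.0) / delta

v = np.array([0.5, 1.0, 2.0])
print(box_cox(v, 0.0))    # log volatility: the LN-SV special case
print(box_cox(v, 1.0))    # the level itself, shifted by 1
print(box_cox(v, 1e-6))   # approaches the log as delta -> 0
```

Estimating `delta` alongside the other parameters is what lets the data measure the degree of departure from the classical specifications.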
Journal of Product & Brand Management | 2004
Xiaolin Xing; Fang-Fang Tang; Zhenlin Yang
This paper investigates prices of consumer electronics sold on the Web by both online-only retailers (Dotcoms) and the online branches of multi-channel retailers (MCRs). Surprisingly, it finds that Dotcoms charge higher prices than MCRs, a conclusion that contradicts the results of most empirical studies. It also finds that electronics prices generally decreased over the period of study, dropping about 0.6 percent per week, and that the prices of MCRs and Dotcoms fell at a similar speed. Further, prices across MCRs are 35.3 percent more dispersed than prices across Dotcoms based on full prices, and 33.1 percent more dispersed based on percentage prices. However, the results show that price dispersion generally increased over time, with no significant difference in speed between MCRs and Dotcoms.
Econometrics Journal | 2013
Badi H. Baltagi; Zhenlin Yang
The robustness of the LM tests for spatial error dependence of Burridge (1980) and Born and Breitung (2011) for the linear regression model, and of Anselin (1988) and Debarsy and Ertur (2010) for the panel regression model with random or fixed effects, is examined. While all tests are asymptotically robust against distributional misspecification, their finite sample behavior may be sensitive to the spatial layout. To overcome this shortcoming, standardized LM tests are suggested. Monte Carlo results show that the new tests possess good finite sample properties. An important observation made throughout this study is that the LM tests for spatial dependence need to be both mean- and variance-adjusted for good finite sample performance to be achieved; the former, however, is often neglected in the literature. Key words: Bootstrap; Distributional misspecification; Group interaction; LM test; Moran's I test; Robustness; Spatial layout; Spatial panel models. JEL codes: C21, C23, C5.
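The unadjusted LM statistic for spatial error dependence in a linear regression (the Burridge 1980 form) can be sketched as follows. The circular weight matrix and the simulated data are illustrative assumptions, and the mean- and variance-adjusted standardization the paper advocates is not implemented here:

```python
import numpy as np

rng = np.random.default_rng(4)

def lm_error_test(y, X, W):
    """Burridge's (1980) LM statistic for spatial error dependence:
    LM = [e'We / (e'e/n)]^2 / tr(W'W + WW),
    asymptotically chi-square(1) under the null of no dependence."""
    n = len(y)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ beta
    sigma2 = e @ e / n
    trace_term = np.trace(W.T @ W + W @ W)
    return float((e @ W @ e / sigma2) ** 2 / trace_term)

# Hypothetical spatial layout: n units on a circle, each unit
# neighbouring the two adjacent units, row-standardized weights.
n = 50
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)  # H0: no spatial dependence
print(lm_error_test(y, X, W))  # compare with the chi-square(1) 5% value, 3.84
```

The paper's point is that the finite-sample distribution of this statistic can drift away from chi-square(1) under some spatial layouts, which is what the standardized versions correct.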
Quality and Reliability Engineering International | 2000
Min Xie; Zhenlin Yang; Olivier Gaudoin
When lifetimes follow a Weibull distribution with known shape parameter, a simple power transformation can be used to transform the data to the exponential case, which is much easier to analyze. In practice, however, the shape parameter is not known exactly, and it is important to investigate the effect of mis-specifying this parameter. In a recent article, it was suggested that the Weibull-to-exponential transformation approach should not be used because the confidence interval for the scale parameter has very poor statistical properties. However, it is of interest to study the use of the Weibull-to-exponential transformation when the mean time to failure or reliability is to be estimated, which is a more common question. In this paper, the effect of mis-specification of the Weibull shape parameter on these quantities is investigated. For reliability-related quantities such as the mean time to failure, percentile lifetime and mission reliability, the Weibull-to-exponential transformation approach is generally acceptable. For cases where the data are highly censored or a small tail probability is of concern, further studies are needed, but these are known to be difficult statistical problems for which there are no standard solutions.
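The transformation and its sensitivity to the assumed shape can be sketched numerically (an illustrative simulation, not the paper's study design; the shape, scale, sample size, and 10% mis-specification are assumed values):

```python
import math
import numpy as np

rng = np.random.default_rng(1)

beta_true, eta = 1.5, 100.0   # assumed Weibull shape and scale
# Weibull lifetimes by inverting the CDF:
t = eta * (-np.log(rng.random(50_000))) ** (1.0 / beta_true)

def mttf_via_transform(t, beta_assumed):
    """Power transform y = t**beta: y is exponential with mean eta**beta,
    so eta_hat = mean(y)**(1/beta) and MTTF = eta_hat * Gamma(1 + 1/beta)."""
    eta_hat = (t ** beta_assumed).mean() ** (1.0 / beta_assumed)
    return eta_hat * math.gamma(1.0 + 1.0 / beta_assumed)

true_mttf = eta * math.gamma(1.0 + 1.0 / beta_true)
print(true_mttf)
print(mttf_via_transform(t, beta_true))        # shape correctly specified
print(mttf_via_transform(t, 1.1 * beta_true))  # shape mis-specified by 10%
```

Even with the shape off by 10%, the MTTF estimate moves only modestly, consistent with the abstract's conclusion that the transformation is generally acceptable for reliability-related quantities.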
Journal of Applied Statistics | 2000
Zhenlin Yang; Min Xie
Many process characteristics follow an exponential distribution, and control charts based on such a distribution have attracted a lot of attention. However, traditional control limits may not be appropriate because of the lack of symmetry. In this paper, process monitoring through a normalizing power transformation is studied. Traditional individual measurement control charts can be used on the transformed data. The properties of this control chart are investigated, and a comparison with the chart using probability limits is carried out for cases of known and estimated parameters. The power transformation approach loses little accuracy, even compared with the exact probability limits, and it readily produces charts that can be interpreted under the normality assumption.
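One commonly used normalizing power for exponential data is the 1/3.6 root; a quick simulation shows the skewness collapsing toward zero after the transform. This is an illustrative sketch (the specific exponent and the chart constants are assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

x = rng.exponential(scale=5.0, size=100_000)   # skewed process data
y = x ** (1.0 / 3.6)                           # assumed normalizing power

def skewness(a):
    """Sample skewness: third central moment over the sd cubed."""
    d = a - a.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

print(f"skewness before: {skewness(x):.2f}")   # about 2 for exponential data
print(f"skewness after:  {skewness(y):.2f}")   # close to zero
# Individuals-chart limits on the transformed scale:
cl, s = y.mean(), y.std(ddof=1)
print(f"limits: [{cl - 3 * s:.3f}, {cl + 3 * s:.3f}]")
```

Because the transformed data are near-normal, the familiar mean plus/minus three sigma limits become meaningful, which is the convenience the abstract describes.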
Social Choice and Welfare | 2006
Winston T. H. Koh; Zhenlin Yang; Lijing Zhu
This paper investigates the allocative efficiency of two non-price allocation mechanisms, the lottery (random allocation) and the waiting-line auction (queue system), for the cases where consumers possess identical time costs (the homogeneous case), and where time costs are correlated with time valuations (the heterogeneous case). We show that the relative efficiency of the two mechanisms depends critically on a scarcity factor (the ratio of the number of objects available for allocation to the number of participants) and on the shape of the distribution of valuations. We show that the lottery dominates the waiting-line auction in a wide range of situations, and that while consumer heterogeneity may improve the relative allocative efficiency of the waiting-line auction, the ranking on relative efficiency is not reversed.
Journal of Statistical Computation and Simulation | 2007
Zhenlin Yang; Min Xie; Augustine Wong
Statistical inference methods for the Weibull parameters and their functions usually depend on extensive tables and hence are rather inconvenient for practical applications. In this paper, we propose a general method for constructing confidence intervals for the Weibull parameters and their functions that eliminates the need for such tables. The method is applied to obtain confidence intervals for the scale parameter, the mean time to failure, the percentile function, and the reliability function. Monte Carlo simulation shows that these intervals possess excellent finite sample properties, with coverage probabilities very close to their nominal levels irrespective of the sample size and the degree of censoring.
Lifetime Data Analysis | 1999
Zhenlin Yang
In predicting a future lifetime based on a sample of past lifetimes, the Box-Cox transformation method provides a simple and unified procedure that is shown in this article to meet, and often outperform, the corresponding frequentist solution in terms of coverage probability and average length of prediction intervals. Kullback-Leibler information and a second-order asymptotic expansion are used to justify the Box-Cox procedure. Extensive Monte Carlo simulations are also performed to evaluate the small sample behavior of the procedure. Several popular lifetime distributions, such as the Weibull, inverse Gaussian, and Birnbaum-Saunders, serve as illustrative examples. One important advantage of the Box-Cox procedure lies in its easy extension to linear model predictions, where exact frequentist solutions are often not available.
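A minimal sketch of a Box-Cox prediction interval for one future lifetime is shown below. This is a simplified plug-in version (grid search for the transformation parameter, a normal quantile on the transformed scale, and Weibull test data are all assumptions), not the article's exact procedure:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)

def boxcox(x, lam):
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def profile_loglik(x, lam):
    """Normal profile log-likelihood of the Box-Cox parameter,
    including the Jacobian term (lam - 1) * sum(log x)."""
    z = boxcox(x, lam)
    return -0.5 * len(x) * np.log(z.var()) + (lam - 1.0) * np.log(x).sum()

def prediction_interval(x, coverage=0.95):
    """Plug-in interval for one future lifetime: choose lambda by a grid
    search, form a normal interval on the transformed scale, invert."""
    grid = np.linspace(-1.0, 1.0, 201)
    lam = max(grid, key=lambda l: profile_loglik(x, l))
    z = boxcox(x, lam)
    m, s = z.mean(), z.std(ddof=1)
    zq = NormalDist().inv_cdf(0.5 + coverage / 2.0)
    half = zq * s * np.sqrt(1.0 + 1.0 / len(x))
    lo, hi = m - half, m + half
    if lam == 0:
        return np.exp(lo), np.exp(hi)
    inv = lambda v: np.maximum(lam * v + 1.0, 1e-12) ** (1.0 / lam)
    return inv(lo), inv(hi)

x = (-np.log(rng.random(100))) ** 0.5   # Weibull(shape=2) lifetimes
print(prediction_interval(x))
```

The appeal the abstract emphasizes is visible here: the same transform-then-normal recipe carries over unchanged when the mean on the transformed scale is replaced by a linear model fit.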