Bartholomew P. K. Leung
Hong Kong Polytechnic University
Publication
Featured research published by Bartholomew P. K. Leung.
Journal of Quality in Maintenance Engineering | 2006
W.K. Yeung; Andrew K. S. Jardine; Bartholomew P. K. Leung
Purpose – This paper aims to discuss and bring to the attention of researchers and practitioners the data management issues relating to condition-based maintenance (CBM) optimization. Design/methodology/approach – The common data quality problems encountered in CBM decision analyses are investigated with a view to suggesting methods to resolve these problems. In particular, the approaches for handling missing data in the decision analysis are reviewed. Findings – This paper proposes a data structure for managing the asset-related maintenance data that support CBM decision analysis. It also presents a procedure for data-driven CBM optimization comprising the steps of data preparation, model construction and validation, decision-making, and sensitivity analysis. Practical implications – Analysis of condition monitoring data using the proportional hazards modeling (PHM) approach has been proved to be successful in optimizing CBM decisions relating to motor transmission equipment, power transformers and manufact...
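The PHM-based decision analysis referred to above can be made concrete with a minimal sketch. This is not the workflow or software used in the paper; it simply shows how a proportional hazards model might be fitted to condition-monitoring histories using the open-source lifelines library, with hypothetical column names.

```python
# Minimal sketch (not the paper's procedure): fitting a proportional hazards
# model to condition-monitoring data, assuming one row per maintenance history.
# Column names ("age_hours", "iron_ppm", "failed") are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

histories = pd.DataFrame({
    "age_hours": [1200, 800, 1500, 400, 950, 1100, 600],  # working age at failure/suspension
    "iron_ppm":  [35, 12, 60, 8, 22, 40, 10],              # condition-monitoring covariate
    "failed":    [1, 0, 1, 0, 1, 1, 0],                    # 1 = failure, 0 = suspension
})

cph = CoxPHFitter()
cph.fit(histories, duration_col="age_hours", event_col="failed")
cph.print_summary()  # the covariate's hazard ratio would drive a CBM decision rule
```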
IIE Transactions | 2002
Bartholomew P. K. Leung; Fred A. Spiring
In this paper, a general class of loss functions based on the inversion of the standard beta probability density function (pdf) is examined. The extension of this loss function from a standard beta pdf defined on (0, 1) to the general beta pdf defined on (p, q) is examined through the scale invariance property under a linear transformation. An industrial application in quality assurance is used to demonstrate this general class of loss functions. Mathematical derivations are attached in the Appendices.
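A sketch of the inverted-pdf construction described above may help; this is one common way such loss functions are written in this literature, and the paper's exact parameterization may differ.

```latex
% Sketch of the inverted-pdf loss construction (the paper's parameterization
% may differ). Given a pdf f with mode at the target T, an inverted probability
% loss function takes the bounded form
\[
  L(y, T) \;=\; K\left[\,1 - \frac{f(y)}{f(T)}\,\right], \qquad 0 \le L(y, T) \le K,
\]
% so the loss is zero at the target and approaches the maximum loss K as f(y)
% vanishes. For a standard beta pdf on (0, 1), the linear map u = (y - p)/(q - p)
% carries the construction to the general beta pdf on (p, q); this is the
% scale-invariance property under a linear transformation noted in the abstract.
```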
Quality Technology and Quantitative Management | 2004
Bartholomew P. K. Leung; Fred A. Spiring
Abstract The concept of inverting a normal probability density function in order to provide practitioners with realistic loss functions was introduced by Spiring [2]. Further developments saw the inversion of other density functions in an attempt to provide a variety of loss functions that could be used in depicting losses associated with deviations from a target (Spiring and Yeung [3], Leung and Spiring [1]). The recent focus has been on the development and application of particular loss functions and their associated risk functions. In this manuscript, several properties associated with the entire family of Inverted Probability Loss Functions (IPLF) are investigated and outlined. As well, several IPLFs that possess interesting and unique properties associated with assessing and depicting losses and loss functions are discussed. Several IPLFs are considered, some plausible conjugate distributions presented, and the general performance compared numerically under homogeneous conditions. Industrial examples demonstrating economic and monetary losses are included.
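As a hedged numerical illustration of the family discussed above, the snippet below evaluates the inverted normal loss (following the general inverted-pdf form sketched earlier) against the classical quadratic loss; the scale parameter and maximum loss are illustrative values, not taken from the paper.

```python
# Numerical sketch comparing an inverted normal loss with the classical
# quadratic (Taguchi-style) loss around a target T. Parameter values are
# illustrative only.
import numpy as np

def inverted_normal_loss(y, target, gamma, K=1.0):
    """Bounded loss: 0 at the target, approaching K far from it."""
    return K * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

def quadratic_loss(y, target, k=1.0):
    """Unbounded quadratic loss for comparison."""
    return k * (y - target) ** 2

y = np.linspace(8.0, 12.0, 9)
print(np.round(inverted_normal_loss(y, target=10.0, gamma=1.0), 3))
print(np.round(quadratic_loss(y, target=10.0, k=0.25), 3))
```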
International Journal of Nursing Studies | 2012
Meyrick Chow; Shu-Man Kwok; Hing-Wah Luk; Jenny W.H. Law; Bartholomew P. K. Leung
BACKGROUND Both continuous and intermittent aspiration of subglottic secretions by means of specially designed endotracheal tubes containing a separate dorsal lumen that opens into the subglottic region have been shown to be useful in reducing ventilator-associated pneumonia (VAP). However, the high cost of these tubes restricts their use. OBJECTIVE The aim of this pilot randomized controlled trial was to test the effect of a low-cost device (saliva ejector) for continuous oral suctioning (COS) on the incidence of VAP in patients receiving mechanical ventilation. METHODS The study was conducted in the six-bed medical-surgical ICU of a hospital with over 400 beds that provides comprehensive medical services to the public. The design of this study was a parallel-group randomized controlled trial. While both the experimental and control groups used the conventional endotracheal tube, the saliva ejector was only applied to patients assigned to the experimental group. The device was put between the patient's cheek and teeth, and then connected to 100 mmHg of suction for the continuous drainage of saliva. RESULTS Fourteen patients were randomized to receive COS and 13 patients were randomized to the control group. The two groups were similar in demographics, reasons for intubation, co-morbidity, and risk factors for acquiring VAP. VAP was found in 3 patients (23.1%; 71 episodes of VAP per 1000 ventilation days) receiving COS and in 10 patients (83.3%; 141 episodes of VAP per 1000 ventilation days) in the control group (relative risk, 0.28; 95% confidence interval, 0.10-0.77; p=0.003). The duration of mechanical ventilation in the experimental group was 3.2 days (SD 1.3), while that in the control group was 5.9 days (SD 2.8) (p=0.009); and the length of ICU stay was 4.8 days (SD 1.6) versus 9.8 days (SD 6.3) for the experimental and control groups, respectively (p=0.019). CONCLUSION Continuous clearance of oral secretion by the saliva ejector may have an important role to play in reducing the rate of VAP, decreasing the duration of mechanical ventilation, and shortening the length of stay of patients in the ICU.
International Journal of Production Research | 2006
C. K. M. Lee; Henry C. W. Lau; Bartholomew P. K. Leung; G.T.S. Ho; K.L. Choy
In today's increasingly competitive environment, firms are experiencing growing pressure to reduce product development lead time to meet market expectations. This paper presents a Responsive Product Development System (RPDS), which models the product development process and its components with object technology and introduces a dynamic product information schema characterized by its ability to provide design practitioners with a product data exchange standard, thus transforming data into information and then knowledge. To validate the feasibility of the proposed schema, a case study has been conducted in a plastic product factory based on the suggested approach. Following feedback from these companies, a further review of the design of the proposed system has been conducted to ensure efficient information flow across the heterogeneous computing environment. A measure of loss to society associated with RPDS in product development time is also included as a system evaluation.
Computational Statistics & Data Analysis | 2010
Ying Chen; Chi Kin Chan; Bartholomew P. K. Leung
Although three-level factorial designs with quantitative factors are not the most efficient way to fit a second-order polynomial model, they often find application when the factors are qualitative. Three-level orthogonal designs with qualitative factors are frequently used, e.g., in agriculture, in clinical trials and in parameter design. It is practically unavoidable that, because of limitations of experimental materials or time-related constraints, we often have to keep the number of experiments as small as possible while considering the effects of many factors and interactions simultaneously, so that most such designs are saturated or nearly saturated. An experimental design is said to be saturated if all degrees of freedom are consumed by the estimation of parameters in modelling the mean response. The difficulty of analyzing orthogonal saturated designs is that there are no degrees of freedom left to estimate the error variance, so that the ordinary ANOVA is no longer available. In this paper, we present a new formal test, based on mean squares, for analyzing three-level orthogonal saturated designs. The proposed method is compared via simulation with several mean-squares-based methods published in the literature. The results show that the new method is more powerful in terms of the empirical power of the test. Critical values used in the proposed procedure for some three-level saturated designs are tabulated. Industrial examples are also included for illustration.
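To make the testing problem concrete, the sketch below shows a generic mean-squares screening rule for an unreplicated saturated design: each effect's mean square is compared against the median of the remaining mean squares. This is not the test proposed in the paper, and the data, threshold, and effect labels are hypothetical.

```python
# Illustrative screening of effects in a saturated design with no degrees of
# freedom left for error. NOT the paper's proposed test: a generic rule that
# flags an effect whose mean square is large relative to the median of the
# other effects' mean squares. Data and threshold are hypothetical.
import numpy as np

mean_squares = np.array([0.8, 1.1, 0.9, 14.2, 1.3, 0.7, 1.0])  # one apparently active effect

for i, ms in enumerate(mean_squares):
    reference = np.median(np.delete(mean_squares, i))  # robust pseudo-error estimate
    ratio = ms / reference
    flag = "active?" if ratio > 10.0 else ""
    print(f"effect {i + 1}: MS = {ms:5.1f}  ratio = {ratio:5.1f}  {flag}")
```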
Systems, Man, and Cybernetics | 2009
Henry C. W. Lau; Cassandra X. H. Tang; Bartholomew P. K. Leung; C. K. M. Lee; George T. S. Ho
Reactive ion etching (RIE) is a process used in the fabrication of semiconductor devices. The ability to predict the influence of the process parameters of RIE is crucial to machine performance, as they may have a serious impact on product quality as well as on the probability of machine failure. To address this issue, this correspondence paper presents a novel performance tradeoff function for evaluating the overall suitability of adopting the predicted control parameters suggested by domain experts, taking into full consideration their impact on the performance of the machine involved. An experiment using the RIE machine is provided to validate the practicability of the proposed approach.
Quality Technology and Quantitative Management | 2006
Smiley W. Cheng; Bartholomew P. K. Leung; Fred A. Spiring
Abstract Research efforts in the area of process capability have largely been devoted to finding a better process capability index (PCI) and, to a lesser extent, to the stochastic behavior of the estimated PCIs [1], [2]. Much of this development has gone unused for many reasons, including a) a plethora of indices, b) interpretation, c) software support, d) standards and e) dissemination. The addition of the indices appears to have had little impact on practitioners. Cp and Cpk (including Cpl and Cpu) [3] continue to be the most heavily used indices, with Cpm [4] and Cpmk [5] occurring occasionally. The addition of stochastic assessments for estimated PCIs is a positive development; however, statistical developments have frequently lacked background knowledge and implementation ease, hindering use by practitioners. A case study from the printing industry will be used to draw attention to areas impacting the practical use of PCIs. Concepts including a) establishing effective tolerance limits, b) the ongoing assessment and interpretation of PCIs and c) ongoing improvement will be presented. We will use the case to a) illustrate a strategy followed by practitioners using PCIs as a quality management tool, b) draw attention to gaps that exist in the practical use of PCIs, c) illustrate how some of these difficulties were overcome and d) highlight research areas in the practical use of PCIs. A variety of quality tools including flowcharts/process documentation, control charts, process capability indices and experimental design are illustrated throughout the manuscript. Data used in the analyses have been included where permitted. Although drawn from the printing industry, the tools used, lessons learned and generic achievements are applicable to the wider area of product design and manufacturing, particularly where customers have unique requirements for a common product.
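For readers unfamiliar with the indices named above, the commonly used textbook definitions are summarized below (for a process with mean μ, standard deviation σ, target T, and specification limits LSL and USL); the paper itself may use estimated quantities in place of the parameters.

```latex
% Commonly used definitions of the process capability indices named above.
\[
  C_p = \frac{USL - LSL}{6\sigma}, \qquad
  C_{pk} = \min\!\left\{\frac{USL - \mu}{3\sigma},\; \frac{\mu - LSL}{3\sigma}\right\},
\]
\[
  C_{pm} = \frac{USL - LSL}{6\sqrt{\sigma^{2} + (\mu - T)^{2}}}, \qquad
  C_{pmk} = \frac{\min\{USL - \mu,\; \mu - LSL\}}{3\sqrt{\sigma^{2} + (\mu - T)^{2}}}.
\]
```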
Computational Statistics & Data Analysis | 2013
Heung Wong; Riquan Zhang; Bartholomew P. K. Leung; Zhensheng Huang
The varying-coefficient single-index models (VCSIMs) form a class of very flexible and general dimension reduction models, which contain many important regression models such as partially linear models, pure single-index models, partially linear single-index models, varying-coefficient models and so on as special examples. However, the testing problems of the index parameter of the VCSIM have not been very well developed, due partially to the complexity of the models. To this end, based on the estimators obtained by the local linear method and the backfitting technique, we propose the generalized F-type test method to deal with the testing problems of the index parameters of the VCSIM. It is shown that under the null hypothesis the proposed test statistic follows asymptotically a χ²-distribution with the scale constant and the degrees of freedom being independent of the nuisance parameters or functions, which is called the Wilks phenomenon. Simulated data and real data examples are used to illustrate our proposed methodology.
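One common way the VCSIM is written in this literature is sketched below; the paper's own notation may differ, and the form is given only to show why the models listed above arise as special cases.

```latex
% One common formulation of the VCSIM (notation may differ from the paper):
% for response Y, index covariates X, and linear covariates Z,
\[
  Y \;=\; \alpha\!\left(\beta^{\top} X\right)^{\top} Z \;+\; \varepsilon,
  \qquad \|\beta\| = 1,
\]
% where \alpha(\cdot) is a vector of unknown coefficient functions and \beta is
% the index parameter under test. Taking Z \equiv 1 recovers a pure single-index
% model, while replacing \beta^{\top}X by a single observed covariate recovers
% the ordinary varying-coefficient model, which is why these appear as special cases.
```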
Communications in Statistics - Simulation and Computation | 2010
Bartholomew P. K. Leung; Heung Wong; Riquan Zhang; Shengtong Han
In this article, partially linear single-index models are discussed based on smoothing splines and the average derivative estimation method. The proposed technique consists of two stages: one is to estimate the vector parameter in the linear part using the smoothing cubic spline method, simultaneously obtaining the estimator of the unknown single-index function; the other is to estimate the single-index coefficients in the single-index part using the average derivative estimation procedure. Some simulated and real examples are presented to illustrate the performance of this method.
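The usual form of a partially linear single-index model is sketched below to clarify which pieces the two stages above estimate; the notation is generic and may differ from the paper's.

```latex
% Generic form of a partially linear single-index model (notation may differ
% from the paper): for response Y, linear covariates X, and index covariates Z,
\[
  Y \;=\; X^{\top}\beta \;+\; g\!\left(Z^{\top}\theta\right) \;+\; \varepsilon,
  \qquad \|\theta\| = 1,
\]
% where \beta is the linear-part parameter (estimated here via smoothing cubic
% splines), g(\cdot) is the unknown single-index function, and \theta is the
% single-index coefficient vector (estimated via the average derivative approach).
```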