Lawrence W. Lan
MingDao University
Publications
Featured research published by Lawrence W. Lan.
Computers & Industrial Engineering | 2009
Ming-Lang Tseng; Jui Hsiang Chiang; Lawrence W. Lan
Selection of appropriate suppliers in supply chain management strategy (SCMS) is a challenging issue because it requires a battery of evaluation criteria/attributes, which are complex, elusive, and uncertain in nature. This paper proposes a novel hierarchical evaluation framework to assist the expert group in selecting the optimal supplier in SCMS. The rationales for the evaluation framework are based upon (i) multi-criteria decision making (MCDM) analysis, which can select the most appropriate alternative from a finite set of alternatives with reference to multiple conflicting criteria; (ii) the analytic network process (ANP) technique, which can simultaneously take into account the feedback and dependence relationships among criteria; and (iii) the Choquet integral, a non-additive fuzzy integral that can eliminate the interactivity of expert subjective judgment problems. A PCB manufacturing firm is studied as a case, and the results indicate that the proposed evaluation framework is simple and reasonable for identifying the primary criteria influencing the SCMS, and effective in determining the optimal supplier even with interactive and interdependent criteria/attributes. This hierarchical evaluation framework provides a complete picture of SCMS contexts for both researchers and practitioners.
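As a quick illustration of the Choquet integral step described above, here is a minimal sketch of the discrete Choquet integral with respect to a fuzzy measure; the criteria names and measure values are hypothetical, not taken from the paper.

```python
def choquet_integral(scores, measure):
    """Discrete Choquet integral of criterion scores w.r.t. a fuzzy measure.

    scores:  dict criterion -> score in [0, 1]
    measure: dict frozenset of criteria -> measure value in [0, 1],
             monotone, with measure(empty set) = 0 and measure(all) = 1
    """
    # Sort criteria by ascending score: x_(1) <= x_(2) <= ... <= x_(n)
    ordered = sorted(scores, key=scores.get)
    total, prev = 0.0, 0.0
    for i, c in enumerate(ordered):
        tail = frozenset(ordered[i:])          # A_(i) = {x_(i), ..., x_(n)}
        total += (scores[c] - prev) * measure[tail]
        prev = scores[c]
    return total

# Hypothetical example: two interacting supplier criteria
mu = {frozenset(): 0.0, frozenset({"cost"}): 0.4, frozenset({"quality"}): 0.5,
      frozenset({"cost", "quality"}): 1.0}   # super-additive: criteria reinforce
print(choquet_integral({"cost": 0.6, "quality": 0.9}, mu))   # -> 0.75
```

Because the measure on the pair exceeds the sum of the singletons, the two criteria reinforce each other, which is exactly the kind of criterion interactivity an additive weighted average cannot express.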
International Journal of Information Management | 2011
Wei-Wen Wu; Lawrence W. Lan; Yu-Ting Lee
Software as a Service (SaaS) is regarded as a favorable solution to enhance a modern organization's IT performance and competitiveness; however, many organizations may still be reluctant to introduce SaaS solutions, mainly because of trust concerns: they may perceive more risks than benefits. This paper presumes that an organization will have greater trust in adopting SaaS solutions when perceived risks decrease and/or perceived benefits increase. To gain insights into this issue, a solution framework using a modified Decision Making Trial and Evaluation Laboratory (DEMATEL) approach is proposed. The core logic is to treat perceived benefits and perceived risks as two distinct themes so that a visible cause-effect diagram can be developed to facilitate the decision makers. A case study is conducted on a Taiwanese company, one of the world's leading manufacturers in the niche and specialized resistor markets. The findings suggest that the case company is concerned more about strategic-oriented benefits than economic-oriented benefits, and more about subjective risks than technical risks. Some implications are addressed accordingly.
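The classic DEMATEL computation behind such cause-effect diagrams is compact enough to sketch. The sketch below shows the standard (unmodified) procedure with illustrative ratings, not the paper's modified variant.

```python
import numpy as np

def dematel(direct):
    """Classic DEMATEL: direct-influence matrix -> total-relation matrix.

    Returns (T, prominence r+c, net effect r-c); a positive net effect
    marks a 'cause' factor, a negative one an 'effect' factor.
    """
    D = np.asarray(direct, dtype=float)
    # Normalize so the matrix powers converge (largest row/column sum -> 1)
    s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
    N = D / s
    T = N @ np.linalg.inv(np.eye(len(D)) - N)   # T = N + N^2 + N^3 + ...
    r, c = T.sum(axis=1), T.sum(axis=0)
    return T, r + c, r - c

# Hypothetical 3-factor influence ratings (0 = none ... 4 = very strong)
T, prominence, net = dematel([[0, 3, 2], [1, 0, 3], [2, 1, 0]])
print(net)   # sign separates cause factors from effect factors
```

Plotting prominence against net effect gives the visible cause-effect diagram the abstract refers to.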
European Journal of Operational Research | 2001
Yu-Chiun Chiou; Lawrence W. Lan
This study employs genetic algorithms to solve clustering problems. Three models (SICM, STCM, and CSPM) are developed according to different coding/decoding techniques. The effectiveness and efficiency of these models under varying problem sizes are analyzed in comparison with a conventional statistical clustering method (the agglomerative hierarchical clustering method). The results for small-scale problems (10–50 objects) indicate that CSPM is the most effective but least efficient method, STCM is the second most effective and efficient, and SICM is the least effective because of its long chromosome. The results for medium-to-large-scale problems (50–200 objects) indicate that CSPM is still the most effective method. Furthermore, we have applied CSPM to solve an exemplary p-median problem. The good results demonstrate that CSPM is readily applicable.
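A toy sketch of genetic-algorithm clustering with an integer-label chromosome (one gene per object, assigning it to a cluster) conveys the general idea; the actual SICM/STCM/CSPM codings are defined in the paper, and this sketch is not any one of them.

```python
import numpy as np

rng = np.random.default_rng(0)

def wcss(X, labels, k):
    """Within-cluster sum of squares: the fitness to minimize."""
    return sum(((X[labels == j] - X[labels == j].mean(axis=0)) ** 2).sum()
               for j in range(k) if (labels == j).any())

def ga_cluster(X, k, pop=60, gens=200, pm=0.05):
    """Toy GA: each chromosome assigns one of k clusters to every object."""
    n = len(X)
    P = rng.integers(0, k, size=(pop, n))
    for _ in range(gens):
        fit = np.array([wcss(X, ind, k) for ind in P])
        elite = P[np.argsort(fit)[: pop // 2]]            # truncation selection
        # One-point crossover between randomly paired elite parents
        pa = elite[rng.integers(0, len(elite), pop)]
        pb = elite[rng.integers(0, len(elite), pop)]
        cut = rng.integers(1, n, pop)
        kids = np.where(np.arange(n) < cut[:, None], pa, pb)
        # Random-reset mutation
        mask = rng.random(kids.shape) < pm
        kids[mask] = rng.integers(0, k, mask.sum())
        P = kids
        P[0] = elite[0]                                   # elitism
    fit = np.array([wcss(X, ind, k) for ind in P])
    return P[np.argmin(fit)]

# Three well-separated synthetic groups of 10 points each
X = rng.normal(size=(30, 2)) + np.repeat([[0, 0], [5, 5], [0, 5]], 10, axis=0)
print(ga_cluster(X, 3))
```

The chromosome here has one gene per object, which is why such codings grow long as the problem size increases, the drawback the abstract attributes to SICM.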
Environmental Monitoring and Assessment | 2011
Ming-Lang Tseng; Lawrence W. Lan; Ray Wang; Anthony Chiu; Hui-Ping Cheng
Green performance measurement is vital for enterprises in making continuous improvements to maintain sustainable competitive advantages. Evaluating green performance, however, is a challenging task due to the complex dependences among aspects and criteria and the linguistic vagueness of some qualitative information alongside the quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the dependent aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependent aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four dependent aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.
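Two generic ingredients named above are easy to sketch: defuzzifying a triangular-fuzzy linguistic rating and placing a criterion in an IPA quadrant. The linguistic scale and cut-offs below are illustrative assumptions, not the paper's values.

```python
# Triangular fuzzy numbers (l, m, u) for linguistic ratings; illustrative scale
SCALE = {"low": (0.0, 0.0, 0.3), "medium": (0.2, 0.5, 0.8), "high": (0.7, 1.0, 1.0)}

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number (l, m, u)."""
    l, m, u = tfn
    return (l + m + u) / 3.0

def ipa_quadrant(importance, performance, i_cut=0.5, p_cut=0.5):
    """Importance-performance analysis: place a criterion in a quadrant."""
    if importance >= i_cut and performance < p_cut:
        return "concentrate here"
    if importance >= i_cut:
        return "keep up the good work"
    if performance >= p_cut:
        return "possible overkill"
    return "low priority"

# A hypothetical green criterion rated 'high' importance, 'medium' performance
imp = defuzzify(SCALE["high"])      # ~0.9
perf = defuzzify(SCALE["medium"])   # 0.5
print(ipa_quadrant(imp, perf))
```

The quadrant labels follow the common Martilla-James IPA terminology.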
European Journal of Operational Research | 2010
Yu-Chiun Chiou; Lawrence W. Lan; Barbara T.H. Yen
Efficiency and effectiveness for non-storable commodities represent two distinct dimensions, and a joint measurement of both is necessary to fully capture overall performance. This paper proposes two novel integrated data envelopment analysis (IDEA) approaches, the integrated Charnes, Cooper and Rhodes (ICCR) and integrated Banker, Charnes and Cooper (IBCC) models, to jointly analyze the overall performance of non-storable commodities under constant and variable returns-to-scale technologies. The core logic of the proposed models is to simultaneously determine the virtual multipliers associated with inputs, outputs, and consumption through additive specifications of technical efficiency and service effectiveness terms with equal weights. We show that both the ICCR and IBCC models possess the essential properties of rationality, uniqueness, and benchmarking power. A case analysis also demonstrates that the proposed IDEA approaches have higher benchmarking power than the conventional separate DEA approaches. More generalized specifications of IDEA models with unequal weights are also elaborated.
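For orientation, here is a sketch of the plain input-oriented CCR multiplier model on which the integrated models build; this is the conventional separate-DEA baseline, not the ICCR/IBCC formulation itself, and the DMU data are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    Decision variables z = [u (output weights), v (input weights)]:
    max u.y_o  s.t.  v.x_o = 1,  u.Y_j - v.X_j <= 0 for all j,  u, v >= 0
    """
    n, m = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])           # linprog minimizes -u.y_o
    A_ub = np.hstack([Y, -X])                          # u.Y_j - v.X_j <= 0
    A_eq = np.concatenate([np.zeros(n), X[o]])[None]   # v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + m))
    return -res.fun

# Hypothetical 4 DMUs, 2 inputs, 1 output
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [5.0, 2.0]])
Y = np.ones((4, 1))
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```

The integrated models described in the abstract extend this multiplier logic by scoring efficiency and effectiveness terms in one additive objective rather than in two separate runs.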
Fuzzy Sets and Systems | 2005
Yu-Chiun Chiou; Lawrence W. Lan
Logic rules and membership functions are two key components of a fuzzy logic controller (FLC). If only one component is learned, the other is often set subjectively, which can reduce the applicability of the FLC. If both components are learned simultaneously, a very long chromosome is often needed, which may degrade the learning performance. To avoid these shortcomings, this paper employs genetic algorithms to learn both logic rules and membership functions sequentially. We propose a bi-level iterative evolution algorithm for selecting the logic rules and tuning the membership functions of a genetic fuzzy logic controller (GFLC). The upper level solves for the composition of logic rules using the membership functions tuned by the lower level; the lower level determines the shape of the membership functions using the logic rules learned from the upper level. We also propose a new encoding method for tuning the membership functions to overcome the problem of too many constraints. Our proposed GFLC model is compared with other similar GFLC, artificial neural network, and fuzzy neural network models, all trained and validated on the same examples with theoretical and field-observed car-following behaviors. The results reveal that our proposed GFLC outperforms the others.
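The bi-level iteration can be summarized as an alternating-optimization skeleton, sketched below as we read the abstract; all function names are illustrative placeholders, not the paper's actual procedures.

```python
def bilevel_gflc_training(init_rules, init_mfs, ga_rules, ga_mfs,
                          fitness, max_iters=20, tol=1e-4):
    """Alternating (bi-level) evolution skeleton: the upper level evolves
    the rule base with membership functions held fixed; the lower level
    tunes membership functions with the rules held fixed.

    ga_rules / ga_mfs are caller-supplied GA routines; fitness scores a
    full controller (rules + membership functions). Hypothetical API.
    """
    rules, mfs = init_rules, init_mfs
    best = fitness(rules, mfs)
    for _ in range(max_iters):
        rules = ga_rules(rules, mfs)      # upper level: learn logic rules
        mfs = ga_mfs(rules, mfs)          # lower level: tune memberships
        score = fitness(rules, mfs)
        if abs(score - best) < tol:       # stop once the iteration converges
            break
        best = score
    return rules, mfs
```

Splitting the search this way keeps each chromosome short, which is precisely the drawback of simultaneous learning that the abstract identifies.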
Transportmetrica | 2005
Lawrence W. Lan; Erwin T.J. Lin
Conventional data envelopment analysis (DEA) approaches (e.g., the CCR model, 1978; the BCC model, 1984) do not adjust for environmental effects, data noise, and slacks when comparing the relative efficiency of decision-making units (DMUs). Consequently, the comparison can be seriously biased because heterogeneous DMUs are not adjusted to a common operating environment and a common state of nature. Although Fried et al. (2002, Journal of Productivity Analysis, 17, 157–174) attempted to overcome this problem by proposing a three-stage DEA approach, they did not account for slack effects, which also leads to biased comparisons. In measuring productivity growth, Färe et al. (1994, American Economic Review, 84, 66–83) proposed a method to calculate input and output distance functions; similarly, they did not take environmental effects, statistical noise, and slacks into account, which also biases the results. To correct these shortcomings, this paper proposes a four-stage DEA approach to measure railway transport technical efficiency and service effectiveness, and a four-stage method to measure productivity and sales-capability growth, both incorporating adjustment for environmental effects, data noise, and slacks. In the empirical study, a total of 308 data points, comprising 44 worldwide railways over seven years (1995–2001), are used as the tested DMUs. The empirical results show strong evidence that efficiency and effectiveness scores are overestimated, and productivity and sales-capability growth overstated, if environmental effects, data noise, and slacks are not adjusted for. Based on our empirical findings, important policy implications are addressed and amelioration strategies for operating railways are proposed.
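For reference, the distance-function-based productivity measure of Färe et al. (1994) cited above is the Malmquist index, built from the period-t and period-(t+1) distance functions:

```latex
% Malmquist productivity index (Färe et al., 1994):
% the geometric mean of the productivity change measured against
% the period-t and period-(t+1) frontiers.
M(x^{t+1}, y^{t+1}, x^{t}, y^{t}) =
  \left[
    \frac{D^{t}(x^{t+1}, y^{t+1})}{D^{t}(x^{t}, y^{t})} \cdot
    \frac{D^{t+1}(x^{t+1}, y^{t+1})}{D^{t+1}(x^{t}, y^{t})}
  \right]^{1/2}
```

A value above one indicates productivity growth between the two periods; the abstract's point is that each distance function, if computed from unadjusted data, inherits the environmental, noise, and slack biases.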
Transportmetrica | 2009
Jiuh-Biing Sheu; Lawrence W. Lan; Yi-San Huang
Short-term prediction of dynamic traffic states remains critical in the field of advanced traffic management systems and related areas. In this article, a novel real-time recurrent learning (RTRL) algorithm is proposed to address this issue. We compare the predictability of a linear method versus the RTRL algorithm, and of a simple non-linear method versus the RTRL algorithm, using a first-order autoregressive time series, AR(1), and a deterministic function. A field study is conducted with flow, speed, and occupancy series collected directly from dual-loop detectors on a freeway. The numerical results reveal that the performance of the RTRL algorithm in predicting short-term traffic dynamics is satisfactory. Furthermore, it is found that the dynamics of short-term traffic states characterized over different time intervals, collected at diverse time lags and times of day, may have significant effects on the prediction accuracy of the proposed algorithms.
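A minimal single-unit sketch of real-time recurrent learning, tested on an AR(1)-like toy series in the spirit of the article's baseline comparison; the network size, learning rate, and data are illustrative assumptions, not the article's.

```python
import numpy as np

def rtrl_predict(series, lr=0.05, seed=0):
    """One-step-ahead prediction with a single tanh recurrent unit:
    h_t = tanh(w_h h_{t-1} + w_x x_t + b). RTRL carries the sensitivities
    dh/dtheta forward online, so weights update at every time step."""
    rng = np.random.default_rng(seed)
    w_h, w_x, b = rng.normal(scale=0.1, size=3)
    h, dh = 0.0, np.zeros(3)              # dh = [dh/dw_h, dh/dw_x, dh/db]
    preds = []
    for t in range(len(series) - 1):
        x, target = series[t], series[t + 1]
        h_new = np.tanh(w_h * h + w_x * x + b)
        # RTRL recursion: total derivative through the recurrent state
        dh = (1 - h_new ** 2) * (w_h * dh + np.array([h, x, 1.0]))
        err = target - h_new
        w_h, w_x, b = np.array([w_h, w_x, b]) + lr * err * dh  # online update
        h = h_new
        preds.append(h_new)
    return np.array(preds)

# AR(1)-like toy series, echoing the article's linear baseline
rng = np.random.default_rng(1)
s = np.zeros(500)
for t in range(1, 500):
    s[t] = 0.8 * s[t - 1] + 0.1 * rng.normal()
print(np.mean((rtrl_predict(s) - s[1:]) ** 2))   # one-step MSE
```

Unlike backpropagation through time, RTRL needs no stored trajectory, which is what makes it suitable for the real-time traffic setting the article targets.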
Benchmarking: An International Journal | 2013
Wei-Wen Wu; Lawrence W. Lan; Yu-Ting Lee
Purpose – The purpose of this paper is to propose a benchmarking framework to evaluate the efficiency and effectiveness of the hotel industry in a multi-period context, with consideration of perishable traits and carry-over activities. The sustained high performers in the case study are identified and their business strategies are discussed.
Design/methodology/approach – The dynamic DEA (data envelopment analysis) approach is used to identify the multi-period sustained high performers. The super-efficiency DEA approach is employed to conduct a thorough ranking under an input-output-consumption structure. A supplementary analysis is further implemented to help elucidate the benchmarking results.
Findings – In total, nine out of 80 international tourist hotels in Taiwan during 2006-2010 are identified as sustained high performers. These hotels have divergent business strategies in terms of employees (intensive versus economical labor forces), products (room versus F&B (food and beverage) services), pric...
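The super-efficiency ranking mentioned above follows the Andersen-Petersen idea of scoring each DMU against a frontier built from all the other DMUs, so efficient units can exceed 1 and be fully ranked. A sketch under constant returns to scale with hypothetical data (the paper's input-output-consumption structure is richer):

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, o):
    """Andersen-Petersen super-efficiency, input-oriented multiplier form:
    identical to CCR except DMU o is excluded from the reference set."""
    keep = np.arange(len(X)) != o            # drop DMU o from the frontier
    n, m = Y.shape[1], X.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])
    A_ub = np.hstack([Y[keep], -X[keep]])
    A_eq = np.concatenate([np.zeros(n), X[o]])[None]
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(keep.sum()),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (n + m))
    return -res.fun

# Hypothetical 4 hotels, 2 inputs, 1 output; scores > 1 flag the top units
X = np.array([[2.0, 3.0], [4.0, 1.0], [4.0, 4.0], [5.0, 2.0]])
Y = np.ones((4, 1))
print([round(super_efficiency(X, Y, o), 2) for o in range(4)])
```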
Transportmetrica | 2006
Lawrence W. Lan; Yeh-Chieh Huang
This paper develops a rolling-trained fuzzy neural network (RTFNN) approach for freeway incident detection. The core logic of this approach is to establish a fuzzy neural network and to update the network parameters in response to prevailing traffic conditions through a rolling-trained procedure. Simulation results for thirty-six incident scenarios in a two-lane freeway mainline case study show that the proposed RTFNN approach improves detection performance over a fuzzy neural network with the same network structure but without parameter updating through the rolling-trained procedure. The highest detection rate in this case study is found at a rolling horizon of 45 minutes and a training sample size of 90 samples.
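A generic sketch of the rolling-trained loop described above: at each step the model is refit on the most recent window of observations before scoring the next one, so its parameters track the prevailing traffic conditions. The trainer, window size, and data below are stand-ins (a least-squares fit in place of the fuzzy neural network), purely to show the retraining mechanics.

```python
import numpy as np

def rolling_train_predict(X, y, fit, horizon=90):
    """Rolling-trained skeleton: refit on the last `horizon` samples,
    then score the next one. `fit(X, y)` is a caller-supplied trainer
    returning a predict(x) callable; names are illustrative."""
    preds = []
    for t in range(horizon, len(X)):
        model = fit(X[t - horizon:t], y[t - horizon:t])  # rolling window
        preds.append(model(X[t]))                        # score next sample
    return np.array(preds)

# Toy usage with a least-squares 'trainer' standing in for the FNN
def ls_fit(Xw, yw):
    w, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    return lambda x: x @ w

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))                 # e.g. flow, speed, occupancy
y = X @ np.array([1.0, -0.5, 0.2]) + 0.05 * rng.normal(size=300)
print(rolling_train_predict(X, y, ls_fit).shape)
```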