Publication


Featured research published by Marvin D. Troutt.


Computers & Industrial Engineering | 2001

Applications of genetic search and simulated annealing to the two-dimensional non-guillotine cutting stock problem

T.W. Leung; C.H. Yung; Marvin D. Troutt

We applied a genetic algorithm and a simulated annealing approach to the two-dimensional non-guillotine cutting stock problem and carried out experimentation on several test cases. The performance and efficiency of these two heuristic algorithms on this problem were compared.
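The abstract does not reproduce the algorithms themselves; as a generic sketch of the simulated-annealing loop that such heuristics are built on (a toy one-dimensional objective stands in for the cutting-stock waste function; none of this is the authors' actual implementation):

```python
import math
import random

def simulated_annealing(cost, start, step, t0=10.0, cooling=0.995, iters=5000, seed=1):
    """Generic simulated-annealing loop: always accept improving moves,
    accept worsening moves with probability exp(-delta/T), cool T geometrically."""
    rng = random.Random(seed)
    x, fx = start, cost(start)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = step(x, rng)           # propose a neighbouring solution
        fc = cost(cand)
        delta = fc - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc          # move to the candidate
            if fx < fbest:
                best, fbest = x, fx   # track the best solution seen so far
        t *= cooling                  # geometric cooling schedule
    return best, fbest

# Toy objective standing in for the cutting-stock waste function.
best, fbest = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    start=0.0,
    step=lambda x, rng: x + rng.gauss(0.0, 0.5),
)
```

A genetic-algorithm variant would replace the single-candidate loop with a population evolved by selection, crossover and mutation; both heuristics share the same cost function, which is how the paper can compare them on identical test cases.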


Journal of Knowledge Management | 2002

A review of naturalistic decision making research with some implications for knowledge management

Peter Meso; Marvin D. Troutt; Justyna Rudnicka

Over the last decade, naturalistic decision making has been pursued by cognitive psychologists. The focus is on how human experts make decisions under conditions of time pressure and complexity; how such experts organize and use their knowledge is expected to provide principles for the emerging science of knowledge management. This paper surveys this research and discusses results indicating that more attention needs to be given to problem formulation, asking the right questions, the use of teams, the organization of knowledge, and expanding the scope of expert systems and case-based reasoning. Moreover, cognitive task analysis, the method generally used in naturalistic decision making research, is readily adaptable to business knowledge management.


Journal of Applied Meteorology | 2001

Estimation of Wind Speed Distribution Using Markov Chain Monte Carlo Techniques

Wan-Kai Pang; Jonathan J. Forster; Marvin D. Troutt

The Weibull distribution is the most commonly used statistical distribution for describing wind speed data. Maximum likelihood has traditionally been the main method of estimation for Weibull parameters. In this paper, Markov chain Monte Carlo techniques are used to carry out a Bayesian estimation procedure using wind speed data obtained from the Observatory of Hong Kong. The method is extremely flexible. Inference for any quantity of interest is routinely available, and it can be adapted easily when data are truncated.
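The Bayesian MCMC procedure itself is not given in the abstract; for reference, the classical maximum-likelihood baseline it improves on can be sketched in a few lines (pure Python, assuming complete, untruncated wind speed data; the synthetic sample below is illustrative, not the Hong Kong Observatory data):

```python
import math
import random

def weibull_mle(data, lo=0.05, hi=50.0, tol=1e-9):
    """Maximum-likelihood fit of a Weibull(shape k, scale lam).
    The shape k solves  sum(x^k ln x)/sum(x^k) - 1/k - mean(ln x) = 0,
    found here by bisection; the scale then follows in closed form."""
    logs = [math.log(x) for x in data]
    mean_log = sum(logs) / len(data)

    def g(k):
        xk = [x ** k for x in data]
        return sum(v * l for v, l in zip(xk, logs)) / sum(xk) - 1.0 / k - mean_log

    while hi - lo > tol:              # g is increasing in k, so bisect
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    k = 0.5 * (lo + hi)
    lam = (sum(x ** k for x in data) / len(data)) ** (1.0 / k)
    return k, lam

# Synthetic "wind speeds": Weibull with shape 2, scale 6 (m/s).
rng = random.Random(42)
speeds = [rng.weibullvariate(6.0, 2.0) for _ in range(5000)]
k_hat, lam_hat = weibull_mle(speeds)
```

The paper's point is that this point estimate is all maximum likelihood offers, whereas the MCMC approach yields full posterior distributions and adapts easily to truncated data.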


Online Information Review | 2009

The effective use of technology in personal knowledge management: A framework of skills, tools and user context

Raj Agnihotri; Marvin D. Troutt

Purpose – The objective of this paper is to further explore the emerging concept of personal knowledge management (PKM) and to bring researchers’ attention to this notion. Specifically, this paper aims to address issues related to the effective utilisation of technology in PKM practices.Design/methodology/approach – A theoretical framework incorporating PKM skills, technology tools, user context and skills‐tools fit is proposed. Arguments are built on the task‐technology fit theory, which explores the link between technology tools and task characteristics (PKM skills).Findings – The impact of effective PKM will depend increasingly on skills‐tools fit.Practical implications – The success of technology utilisation resides not simply in whether individuals use technology, but if this usage actually improves effectiveness. For their own benefit, individuals should consider and assess the technology tools in the context of how they will be aligned with specific PKM skills.Originality/value – Proposing a concep...


European Journal of Operational Research | 2009

Objective comparisons of the optimal portfolios corresponding to different utility functions

Bosco Yu; Wan-Kai Pang; Marvin D. Troutt; Shui Hung Hou

This paper considers the effects of some frequently used utility functions in portfolio selection by comparing the optimal investment outcomes corresponding to these utility functions. Assets are assumed to form a complete market of the Black-Scholes type. Four frequently used utility functions are considered: the power, logarithm, exponential and quadratic utility functions. To make objective comparisons, the optimal terminal wealths are derived by an integration representation. The optimal strategies that attain these optimal values are obtained via the integration representation of a Brownian martingale. The explicit strategy for the quadratic utility function is new. The strategies for the other utility functions, such as the power and logarithm utilities, obtained this way coincide with known results from Merton's dynamic programming approach.
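For context, the classical dynamic programming benchmark against which the integration-representation strategies are checked is Merton's constant-proportion rule: for power utility in a Black-Scholes market with drift \(\mu\), risk-free rate \(r\) and volatility \(\sigma\), the optimal fraction of wealth held in the risky asset is constant,

```latex
u(w) = \frac{w^{1-\gamma}}{1-\gamma}, \qquad
\pi^{*} = \frac{\mu - r}{\gamma \sigma^{2}},
```

with the logarithm utility recovered in the limit \(\gamma \to 1\), giving \(\pi^{*} = (\mu - r)/\sigma^{2}\).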


Management Science | 2006

Behavioral Estimation of Mathematical Programming Objective Function Coefficients

Marvin D. Troutt; Wan-Kai Pang; S H Hou

We propose a parameter estimation method based on what we call the minimum decisional regret principle. We focus on mathematical programming models with objective functions that depend linearly on costs or other parameters. The approach is illustrated for cost estimation in production planning using linear programming models. The method uses past planning data to estimate costs that are otherwise difficult to estimate. We define a monetary measure of distance between observed plans and optimal ones, called decisional regret. The proposed estimation algorithm finds parameter values for which the associated optimal plans are as near as possible to the observed ones on average. Such techniques may be called behavioral estimation because they are based on the observed planning or decision-making behavior of managers or firms. Two numerical illustrations are given. A supporting hyperplane algorithm is used to solve the estimation model. A method is proposed for obtaining range estimates of the parameters when multiple alternative estimates exist. We also propose a new validation approach for this estimation principle, which we call the target-mode agreement criterion.
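A toy illustration of the minimum-decisional-regret principle (a hypothetical two-product LP, not the authors' production planning model): for each candidate cost parameter, regret is the gap between the LP-optimal objective and the objective achieved by the observed plan, and estimation selects the parameter values minimizing average regret. The zero-regret set here comes out as an interval, matching the paper's point about range estimates when multiple alternative estimates exist.

```python
# Hypothetical planning LP:  maximize  theta*x1 + (1-theta)*x2
# subject to  x1 + x2 <= 100,  x1 <= 70,  x2 <= 60,  x1, x2 >= 0.
# An LP optimum lies at a vertex, so we enumerate the vertices directly.
VERTICES = [(0, 0), (70, 0), (0, 60), (70, 30), (40, 60)]

def value(plan, theta):
    x1, x2 = plan
    return theta * x1 + (1 - theta) * x2

def regret(plan, theta):
    """Decisional regret: optimal value minus the observed plan's value."""
    opt = max(value(v, theta) for v in VERTICES)
    return opt - value(plan, theta)

observed_plans = [(70, 30)]          # past plans recorded by the firm
grid = [i / 100 for i in range(101)] # candidate values of theta
mean_regret = {t: sum(regret(p, t) for p in observed_plans) / len(observed_plans)
               for t in grid}
# Every theta with (near-)zero mean regret: a *range* estimate of the parameter.
zero_set = [t for t in grid if mean_regret[t] < 1e-9]
```

The paper's supporting hyperplane algorithm replaces this grid search for realistic model sizes; the grid is only to make the regret surface visible in a toy setting.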


European Journal of Operational Research | 2008

A simulation-based approach to the study of coefficient of variation of dividend yields

Wan-Kai Pang; Bosco Yu; Marvin D. Troutt; Shui Hung Hou

Existing empirical studies of dividend yields and dividend policies either make no distributional assumption or assume that dividend yields are normally distributed. The resulting statistics will be biased because the normal model cannot reflect the finite-support property of dividend yields, which range only from 0 to 1. We posit that the assumption that dividend yields follow a beta distribution is more appropriate. The coefficient of variation (CV) is used to measure the stability of dividend yields. Under a normality assumption, the maximum likelihood estimate of the coefficient of variation is the ratio of the estimated standard deviation to the estimated mean. This gives only a point estimate, which cannot depict the full picture of the sampling distribution of the coefficient of variation. A simulation-based approach is adopted to estimate the CV under the beta distribution. This approach yields a point estimate as well as the empirical sampling distribution of the CV. With this approach, we study the stability of dividend yields of the Hang Seng index and its sub-indexes of the Hong Kong stock market and compare the results with the traditional approach.
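A minimal sketch of such a simulation-based CV estimate (a method-of-moments beta fit followed by resampling; the yield sample and all parameter values below are hypothetical, not the Hang Seng data):

```python
import random
import statistics as st

def beta_mom(sample):
    """Method-of-moments Beta(a, b) fit for data on (0, 1)."""
    m, v = st.fmean(sample), st.pvariance(sample)
    common = m * (1 - m) / v - 1
    return m * common, (1 - m) * common

def simulated_cv(sample, reps=1000, seed=7):
    """Simulation-based sampling distribution of the coefficient of
    variation (sd/mean) under a beta distribution fitted to the sample."""
    a, b = beta_mom(sample)
    rng = random.Random(seed)
    n = len(sample)
    cvs = []
    for _ in range(reps):
        draw = [rng.betavariate(a, b) for _ in range(n)]
        cvs.append(st.pstdev(draw) / st.fmean(draw))
    return st.fmean(cvs), cvs   # point estimate plus empirical distribution

# Hypothetical dividend-yield sample (fractions in (0, 1)):
rng = random.Random(1)
yields = [rng.betavariate(2.0, 48.0) for _ in range(250)]
cv_hat, cv_draws = simulated_cv(yields)
```

Unlike the single normal-theory point estimate, `cv_draws` gives the whole empirical sampling distribution, from which percentile intervals for the CV follow directly.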


European Journal of Operational Research | 2011

DEA based dimensionality reduction for classification problems satisfying strict non-satiety assumption

Parag C. Pendharkar; Marvin D. Troutt

This study shows how data envelopment analysis (DEA) can be used to reduce the vertical dimensionality of certain data mining databases. The study illustrates basic concepts using a real-world graduate admissions decision task. It is well known that cost-sensitive mixed integer programming (MIP) problems are NP-complete. This study shows that heuristic solutions for cost-sensitive classification problems can be obtained by solving a simple goal programming problem that reduces the vertical dimension of the original learning dataset. Using simulated datasets and a misclassification cost performance metric, the performance of the proposed goal programming heuristic is compared with the extended DEA-discriminant analysis MIP approach. The holdout sample results of our experiments show that the proposed heuristic approach outperforms the extended DEA-discriminant analysis MIP approach.


Journal of Systems and Software | 2006

The knowledge management efficacy of matching information systems development methodologies with application characteristics-an experimental study

Peter Meso; Gregory R. Madey; Marvin D. Troutt; Jens Liegle

An experimental study was conducted to determine whether appropriately matching methodology type to the characteristics of the business application being developed resulted in more effective knowledge-work processes among team members. Specifically, the experiment compared the use of hypermedia systems development methodologies with that of conventional software engineering methodologies in enabling knowledge-work processes during the development of hypermedia-intensive business applications. The results indicate that there is value in effectively matching methodologies to the application domain. Based on this finding, there is justification for employing strongly typed methodologies in systems design projects, particularly where the application domain is quite specialized. Our results also suggest that Information Technology (IT) studies that assess methodology influences on the resultant Information System (IS) artifact need to include knowledge and/or cognitive elements and their related theories. Since systems development is knowledge intensive, inclusion of cognitive and knowledge aspects provides a more complete model of how methodologies influence the various aspects of the IS artifact.


European Journal of Operational Research | 2010

An empirical method for assessing the research relevance gap

Suvankar Ghosh; Marvin D. Troutt; John H. Thornton; O. Felix Offodile

There has been much debate on the relevance to firms of the academic research produced by business schools. However, what has not received as much attention is how the relevance of the research to businesses should be measured in a systematic and empirical way. We develop a systematic method to test for the relevance of academic research to businesses. Our method models as a vector autoregressive process the interests of the academic and practitioner communities in some new topic, as expressed by the number of articles published in the academic and the practitioner literature on that topic per calendar quarter, and then studies Granger causality between the academic and practitioner interest processes. This method can be used by academics to empirically demonstrate the impact of their intellectual contributions on practitioners and thence on the business world. We apply our approach to two relatively new and important topics, Real Options and Economic Value Added.
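A minimal sketch of the Granger-causality idea (lag 1 only, with synthetic series standing in for the quarterly article counts; the paper uses a full VAR with proper lag selection): if lagged academic interest significantly reduces the residual sum of squares when predicting practitioner interest, academic interest is said to Granger-cause practitioner interest.

```python
import random

def ols_ssr(X, y):
    """Least-squares residual sum of squares via the normal equations
    (tiny Gaussian elimination; fine for a handful of regressors)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(X, y))]
         for i in range(k)]
    for c in range(k):                       # forward elimination with pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [a - f * b for a, b in zip(A[r], A[c])]
    beta = [0.0] * k
    for c in reversed(range(k)):             # back substitution
        beta[c] = (A[c][k] - sum(A[c][j] * beta[j] for j in range(c + 1, k))) / A[c][c]
    return sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

def granger_f(x, y):
    """Lag-1 Granger test: F statistic for adding x[t-1] to y[t] ~ const + y[t-1]."""
    rows_r = [[1.0, y[t - 1]] for t in range(1, len(y))]
    rows_u = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]
    target = [y[t] for t in range(1, len(y))]
    ssr_r, ssr_u = ols_ssr(rows_r, target), ols_ssr(rows_u, target)
    df = len(target) - 3                     # n minus 3 unrestricted parameters
    return (ssr_r - ssr_u) / (ssr_u / df)

# Synthetic quarterly series in which academic interest x leads practitioner
# interest y (hypothetical numbers, for illustration only).
rng = random.Random(3)
x = [rng.gauss(10, 2) for _ in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.4 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0, 1))
f_stat = granger_f(x, y)
```

A large F statistic relative to the F(1, df) reference distribution indicates Granger causality from x to y; running `granger_f(y, x)` as well checks the reverse direction.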

Collaboration


Marvin D. Troutt's top co-authors and their affiliations.

Top Co-Authors

Wan-Kai Pang, Hong Kong Polytechnic University
Peter Meso, Georgia State University
Siddhartha Bhattacharyya, University of Illinois at Chicago
Bosco Yu, Hong Kong Polytechnic University
S H Hou, Hong Kong Polytechnic University
Shui Hung Hou, Hong Kong Polytechnic University
O. Felix Offodile, Saint Petersburg State University