
Publications


Featured research published by Anil Gaba.


Operations Research | 2004

Modifying Variability and Correlations in Winner-Take-All Contests

Anil Gaba; Ilia Tsetlin; Robert L. Winkler

We consider contests with a fixed proportion of winners based on relative performance. Special attention is paid to winner-take-all contests, which we define as contests with relatively few winners receiving relatively large awards, but we consider the full range of values of the proportion of winners. If a contestant has the opportunity to modify the distribution of her performance, what strategy is advantageous? When the proportion of winners is less than one-half, a riskier performance distribution is preferred; when this proportion is greater than one-half, it is better to choose a less risky distribution. Using a multinormal model, we consider modifications in the variability of the distribution and in correlations with the performance of other contestants. Increasing variability and decreasing correlations lead to improved chances of winning when the proportion of winners is less than one-half, and the opposite directions should be taken for proportions greater than one-half. Thus, it is better to take chances and to attempt to distance oneself from the other contestants (i.e., to break away from the herd) when there are few winners; a more conservative, herding strategy makes sense when there are many winners. Our analytical and numerical results indicate that the probability of winning can change substantially as variability and/or correlations are modified. Furthermore, in a game-theoretic setting in which all contestants can make modifications, choosing a riskier (less risky) performance distribution when the proportion of winners is low (high) is the dominant best-response strategy. We briefly consider some practical issues related to the recommended strategies and some possible extensions.
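The multinormal setting above lends itself to a small Monte Carlo check. The sketch below is illustrative only (the parameter values and the equicorrelated factor construction are assumptions, not the paper's): n contestants have equicorrelated standard-normal performances, contestant 0 scales her standard deviation, and we estimate her probability of finishing among the top `winners`. Note that scaling contestant 0's score leaves her correlation with the others at rho, so only her variability changes.

```python
import numpy as np

rng = np.random.default_rng(0)

def win_prob(sigma, n=10, winners=2, rho=0.3, trials=100_000):
    """Estimate P(contestant 0 is among the top `winners` of n) when all
    performances are equicorrelated standard normals (correlation rho)
    and contestant 0 scales her standard deviation by `sigma`."""
    # Equicorrelated normals via a common factor:
    # X_i = sqrt(rho) * Z + sqrt(1 - rho) * E_i
    z = rng.standard_normal((trials, 1))
    e = rng.standard_normal((trials, n))
    x = np.sqrt(rho) * z + np.sqrt(1 - rho) * e
    x[:, 0] *= sigma  # contestant 0 modifies her variability
    beaten_by = (x[:, 1:] > x[:, :1]).sum(axis=1)  # rivals who outperform her
    return float((beaten_by < winners).mean())

# Few winners (2 of 10, proportion < 1/2): higher variability should help.
print(win_prob(0.5), win_prob(1.0), win_prob(2.0))
```

Rerunning with `winners=8` (proportion of winners above one-half) reverses the ordering, matching the conservative, herding recommendation for contests with many winners.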


Journal of Risk and Uncertainty | 2004

Strategic Choice of Variability in Multiround Contests and Contests with Handicaps

Ilia Tsetlin; Anil Gaba; Robert L. Winkler

Variability can be an important strategic variable in a contest. We study optimal strategies involving choice of variability in contests with fixed and probabilistic targets, one-round and multiround contests, contests with and without handicaps, and situations where one contestant can modify variability as well as those in which all contestants have this opportunity. A contestant should maximize variability in a weak position (low mean, high handicap, or low previous performance) and minimize variability in a strong position. In some cases, only these extremes should be used. In other cases, intermediate levels of variability are optimal when the contestant's position is neither too weak nor too strong.


Management Science | 2013

Unpacking the Future: A Nudge Toward Wider Subjective Confidence Intervals

Kriti Jain; Kanchan Mukherjee; J. Neil Bearden; Anil Gaba

Subjective probabilistic judgments in forecasting are inevitable in many real-life domains. A common way to obtain such judgments is to assess fractiles or confidence intervals. However, these judgments tend to be systematically overconfident, and it has proved particularly difficult to debias such forecasts and improve their calibration. This paper proposes a simple process that systematically leads to wider confidence intervals, thus reducing overconfidence. With a series of experiments, including some with professionals, we show that unpacking the distal future into intermediate, more proximal futures systematically improves calibration. We refer to this phenomenon as the time unpacking effect, find that it is robust to different elicitation formats, and address the possible reasons behind it. We further show that time unpacking yields better overall forecasting performance when the improved calibration is traded off against reduced sharpness, and that substantive benefits can be obtained from even one level of time unpacking. This paper was accepted by Teck Ho, decision analysis.


Journal of Risk and Uncertainty | 1995

The Impact of Testing Errors on Value of Information: A Quality-Control Example

Anil Gaba; Robert L. Winkler

In this article, we extend recent work on the inferential impact of errors in data to a decision-making setting. In the context of a simple quality-control example, we illustrate how errors can cause substantial reductions in the value of information from a sample, and how uncertainty about error rates can lead to still further reductions in the expected value of sample information (EVSI). Moreover, we extend the notion of an equivalent error-free sample size (which indicates the reduction in effective sample size due to errors) from an inferential framework to a decision-making framework, and find that as uncertainty about error-rate parameters increases, the reductions in effective sample size are even greater for a decision maker than the inferential measures suggest.
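The effect of testing errors on EVSI can be illustrated with a toy accept/reject decision. The prior, payoffs, and error rates below are invented for illustration and are not the paper's example: a lot's defect rate is either low or high, each of n sampled items yields a noisy pass/fail test, and EVSI is the expected gain from observing the test results before deciding.

```python
from math import comb

# Illustrative (hypothetical) two-point prior on the lot's defect rate.
prior = {0.05: 0.7, 0.25: 0.3}

def accept_payoff(p):
    # Payoff of accepting a lot with defect rate p; rejecting pays 0.
    return 100 - 1000 * p

def best_payoff(dist):
    # Expected payoff of the better action (accept vs. reject) under belief `dist`.
    return max(sum(dist[p] * accept_payoff(p) for p in dist), 0.0)

def evsi(n, fp=0.0, fn=0.0):
    """EVSI of testing n items when the test has false-positive rate fp
    and false-negative rate fn (fp = fn = 0 is an error-free test)."""
    # Probability an item tests defective, given true defect rate p.
    q = {p: p * (1 - fn) + (1 - p) * fp for p in prior}
    value_with_sample = 0.0
    for k in range(n + 1):  # enumerate possible positive counts
        lik = {p: comb(n, k) * q[p] ** k * (1 - q[p]) ** (n - k) for p in prior}
        marg = sum(prior[p] * lik[p] for p in prior)
        post = {p: prior[p] * lik[p] / marg for p in prior}
        value_with_sample += marg * best_payoff(post)
    return value_with_sample - best_payoff(prior)

print(evsi(20))                  # error-free sample
print(evsi(20, fp=0.1, fn=0.1))  # noisy test: strictly lower EVSI
```

Because the noisy test result is a garbling of the item's true status, its EVSI can never exceed the error-free value, which matches the article's qualitative conclusion.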


Decision Analysis | 2017

Combining Interval Forecasts

Anil Gaba; Ilia Tsetlin; Robert L. Winkler

When combining forecasts, a simple average of the forecasts performs well, often better than more sophisticated methods. In a prescriptive spirit, we consider some other parsimonious, easy-to-use heuristics for combining interval forecasts and compare their performance with the benchmark provided by the simple average, using simulations from a model we develop and data sets with forecasts made by professionals in their domain of expertise. We find that the empirical results closely match the results from our model, thus providing some validation for the theoretical model. The relative performance of the heuristics is influenced by the degree of overconfidence in and dependence among the individual forecasts, and different heuristics come out on top under different circumstances. The results provide some good, easy-to-use alternatives to the simple average with an indication of the conditions under which each might be preferable, enabling us to conclude with some prescriptive advice.
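Two of the simplest combination rules can be compared in a small simulation. The setup below is a sketch under assumed parameters (equicorrelated forecaster errors, a fixed overconfidence factor), not the paper's model or data: each of k forecasters reports a nominal 80% interval that is too narrow, and we measure the empirical coverage of the simple average of endpoints versus the envelope (widest) combination.

```python
import numpy as np

rng = np.random.default_rng(1)

def coverage(combine, k=5, trials=100_000, rho=0.7, overconf=0.6):
    """Empirical coverage of a combined nominal-80% interval from k
    forecasters whose center errors are equicorrelated (correlation rho)
    and whose interval widths are shrunk by `overconf` (< 1 = overconfident)."""
    truth = rng.standard_normal(trials)
    common = rng.standard_normal((trials, 1))
    idio = rng.standard_normal((trials, k))
    centers = truth[:, None] + np.sqrt(rho) * common + np.sqrt(1 - rho) * idio
    half = 1.2816 * overconf  # z_{0.90} = 1.2816 would be calibrated
    lo, hi = centers - half, centers + half
    L, U = combine(lo, hi)
    return float(((L <= truth) & (truth <= U)).mean())

simple_average = lambda lo, hi: (lo.mean(axis=1), hi.mean(axis=1))
envelope = lambda lo, hi: (lo.min(axis=1), hi.max(axis=1))

print(coverage(simple_average))  # averaging keeps the too-narrow width
print(coverage(envelope))        # always at least as wide, so higher coverage
```

With strongly dependent, overconfident forecasters the simple average stays well below nominal coverage while the envelope widens the interval, illustrating why the best heuristic depends on the degree of overconfidence and dependence.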


Management Science | 2018

Assessing Uncertainty from Point Forecasts

Anil Gaba; Dana G. Popescu; Zhi Chen

The paper develops a model for combining point forecasts into a predictive distribution for a variable of interest. Our approach allows for point forecasts to be correlated and admits uncertainty on the distribution parameters given the forecasts. Further, it provides an easy way to compute an augmentation factor needed to equate the dispersion of the point forecasts to that of the predictive distribution, which depends on the correlation between the point forecasts and on the number of forecasts. We show that ignoring dependence or parameter uncertainty can lead to assuming an unrealistically narrow predictive distribution. We further illustrate the implications in a newsvendor context, where our model leads to an order quantity that has higher variance but is biased in the less costly direction, and generates an increase in expected profit relative to other methods. The e-companion is available at https://doi.org/10.1287/mnsc.2017.2936. This paper was accepted by Vishal Gaur, operations management.
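The role of an augmentation factor can be sketched in a simulation with assumed numbers (equicorrelated forecast errors with known correlation rho; this is an illustration of the idea, not the paper's estimator): the dispersion among correlated point forecasts understates the uncertainty about the quantity itself, so intervals built directly from that dispersion are too narrow.

```python
import numpy as np

rng = np.random.default_rng(2)

def coverage(k=5, rho=0.8, trials=100_000, z=1.96, augment=True):
    """Coverage of a nominal-95% predictive interval built from k point
    forecasts whose errors (sd 1) are equicorrelated with correlation rho."""
    truth = rng.standard_normal(trials)
    g = rng.standard_normal((trials, 1))
    u = rng.standard_normal((trials, k))
    f = truth[:, None] + np.sqrt(rho) * g + np.sqrt(1 - rho) * u
    center = f.mean(axis=1)
    s = f.std(axis=1, ddof=1)  # dispersion of the point forecasts
    # Under this error model the forecast dispersion estimates sqrt(1 - rho),
    # while the error of the mean forecast has sd sqrt(rho + (1 - rho) / k):
    # the ratio is the augmentation factor, growing with rho.
    a = np.sqrt((rho + (1 - rho) / k) / (1 - rho)) if augment else 1.0
    half = z * a * s
    return float(((center - half <= truth) & (truth <= center + half)).mean())

print(coverage(augment=False))  # naive: raw dispersion, far below 95%
print(coverage(augment=True))   # augmented: much closer to nominal
```

Even the augmented interval falls somewhat short of 95% here because s is estimated from only k = 5 forecasts, echoing the paper's point that parameter uncertainty given the forecasts must also be admitted (e.g., via a t rather than a normal quantile).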


Organization Science | 1998

Coda: Creativity and Improvisation in Jazz and Organizations: Implications for Organizational Learning

Anil Gaba; W. Kip Viscusi; Alan D. Meyer; Frank J. Barrett


International Journal of Forecasting | 2009

Forecasting and uncertainty in the economic and business world

Spyros Makridakis; Robin M. Hogarth; Anil Gaba


Marketing Science | 1999

Risk Behavior in Response to Quotas and Contests

Anil Gaba; Ajay Kalra


Management Science | 1992

Implications of Errors in Survey Data: A Bayesian Model

Anil Gaba; Robert L. Winkler
