German Molina
Credit Suisse
Publications
Featured research published by German Molina.
Journal of the American Statistical Association | 2008
Feng Liang; Rui Paulo; German Molina; Merlise A. Clyde; James O. Berger
Zellner's g prior remains a popular conventional prior for use in Bayesian variable selection, despite several undesirable consistency issues. In this article we study mixtures of g priors as an alternative to default g priors that resolve many of the problems with the original formulation while maintaining the computational tractability that has made the g prior so popular. We present theoretical properties of the mixture g priors and provide real and simulated examples to compare the mixture formulation with fixed g priors, empirical Bayes approaches, and other default procedures. Please see Arnold Zellner's letter and the authors' response.
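The tractability mentioned above comes from the closed-form null-based Bayes factor under the g prior, which reduces mixing over g to a one-dimensional integral. A minimal sketch, assuming the standard null-based Bayes factor and a hyper-g style mixture prior; the sample size, R-squared and hyperparameter values are illustrative, not taken from the article:

```python
import math

def bf_given_g(g, n, p, r2):
    # Null-based Bayes factor for a linear model with p predictors,
    # n observations and coefficient of determination r2,
    # under Zellner's g-prior (standard closed form).
    return (1.0 + g) ** ((n - 1 - p) / 2.0) * (1.0 + g * (1.0 - r2)) ** (-(n - 1) / 2.0)

def hyper_g_bf(n, p, r2, a=3.0, g_max=1e4, steps=200000):
    # Bayes factor under a hyper-g style mixture prior
    # p(g) = ((a-2)/2) * (1+g)^(-a/2), integrated by the trapezoid rule.
    h = g_max / steps
    total = 0.0
    for i in range(steps + 1):
        g = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * bf_given_g(g, n, p, r2) * ((a - 2.0) / 2.0) * (1.0 + g) ** (-a / 2.0)
    return total * h

print(bf_given_g(g=50.0, n=50, p=2, r2=0.4))  # fixed unit-information prior, g = n
print(hyper_g_bf(n=50, p=2, r2=0.4))          # averaged over the mixture prior
```

The fixed-g and mixture Bayes factors can then be compared across simulated models, which is the kind of comparison the article carries out at scale.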
HANDBOOK OF QUANTITATIVE FINANCE AND RISK MANAGEMENT,Springer Verlag | 2010
German Molina; Chuan-Hsiang Han; Jean-Pierre Fouque
In this paper we propose to use Markov chain Monte Carlo (MCMC) methods to estimate the parameters of stochastic volatility models with several factors varying at different time scales. The originality of our approach, in contrast with classical factor models, is the identification of two factors driving univariate series at well-separated time scales. This is tested with simulated data as well as foreign exchange data.
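The two-scale structure can be illustrated by simulating a toy version of such a model, with log-volatility driven by a fast and a slow mean-reverting factor; all parameter values below are illustrative choices, not estimates from the paper:

```python
import math, random

def simulate_two_scale_sv(n=2000, seed=7):
    # Toy two-factor stochastic volatility: log-volatility is the sum of a
    # fast and a slow mean-reverting (OU-type) factor with well-separated
    # mean-reversion speeds, as in multiscale SV models.
    rng = random.Random(seed)
    dt = 1.0 / 252.0
    fast, slow = 0.0, 0.0
    kappa_fast, kappa_slow = 100.0, 1.0   # well-separated time scales
    returns = []
    for _ in range(n):
        fast += -kappa_fast * fast * dt + 0.5 * math.sqrt(dt) * rng.gauss(0, 1)
        slow += -kappa_slow * slow * dt + 0.5 * math.sqrt(dt) * rng.gauss(0, 1)
        sigma = 0.2 * math.exp(fast + slow)   # instantaneous volatility
        returns.append(sigma * math.sqrt(dt) * rng.gauss(0, 1))
    return returns

r = simulate_two_scale_sv()
print(len(r))
```

Fitting such a series is then an MCMC problem over the latent factor paths and the parameters of both factors, which is what the paper addresses.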
Transportation Research Record | 2004
M. J. Bayarri; James O. Berger; German Molina; Nagui M. Rouphail; Jerome Sacks
Calibrating and validating a traffic simulation model for use on a transportation network depend on field data that are often limited but essential for determining inputs to the model and for assessing its reliability. Quantification and systemization of the calibration/validation process expose statistical issues inherent in the use of such data. These issues are discussed, and a methodology to address them is described. The formalization of the calibration/validation process leads naturally to the use of Bayesian methodology for assessing uncertainties in model predictions arising from a multiplicity of sources (randomness in the simulator, statistical variability in estimating and calibrating input parameters, inaccurate data, and model discrepancy). The methods and the approach are exhibited on an urban street network with the microsimulator CORSIM, and the demand and turning movement parameters are calibrated. A discussion of how the process can be extended to deal with other model parameters as well as with the possible misspecification of the model is included. Although the methods are described in a specific context, they can be used generally, although they are inhibited at times by computational burdens that must be overcome, often by developing approximations to the simulator.
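The core idea of propagating input uncertainty can be sketched generically: draw calibrated inputs from their posterior, run the stochastic simulator once per draw, and compare the resulting spread with a plug-in analysis that fixes the inputs at a point estimate. The toy simulator and all numbers below are hypothetical stand-ins, not CORSIM or the paper's calibrated quantities:

```python
import random, statistics

def toy_simulator(demand_rate, rng):
    # Stand-in for a stochastic traffic simulator: hourly throughput with
    # internal randomness (a hypothetical model, not CORSIM's).
    return max(0.0, rng.gauss(demand_rate, 0.1 * demand_rate))

rng = random.Random(1)
# Posterior draws of a calibrated input (a demand rate), e.g. from MCMC;
# the wide spread reflects inaccurate field data.
posterior_draws = [rng.gauss(1200.0, 150.0) for _ in range(500)]

# Plug-in approach: fix the input at its posterior mean,
# so only simulator randomness remains.
mean_input = statistics.mean(posterior_draws)
plugin = [toy_simulator(mean_input, rng) for _ in range(500)]

# Fully Bayesian approach: one simulator run per posterior draw, so the
# predictions reflect both input uncertainty and simulator randomness.
bayes = [toy_simulator(d, rng) for d in posterior_draws]

print(statistics.stdev(plugin), statistics.stdev(bayes))
```

The wider spread of the second set of runs is exactly the additional uncertainty the plug-in (tuning) approach ignores.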
Quantitative Finance | 2012
Enrique ter Horst; Abel Rodriguez; Henryk Gzyl; German Molina
Mounting empirical evidence suggests that the observed extreme prices within a trading period can provide valuable information about the volatility of the process within that period. In this paper we define a class of stochastic volatility models that uses opening and closing prices along with the minimum and maximum prices within a trading period to infer the dynamics underlying the volatility process of asset prices and compare it with similar models presented previously in the literature. The paper also discusses sequential Monte Carlo algorithms to fit this class of models and illustrates its features using both a simulation study and real data.
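A simple way to see why intraperiod extremes carry volatility information is the classical Parkinson range-based estimator, which is markedly more efficient than the close-to-close estimator; this is only a motivating illustration, not the authors' stochastic volatility model:

```python
import math, random

def simulate_day(sigma, steps, rng):
    # Intraday log-price path with daily volatility sigma and no drift;
    # returns the day's (high - low) range and its close-to-close return.
    x, hi, lo = 0.0, 0.0, 0.0
    sd = sigma / math.sqrt(steps)
    for _ in range(steps):
        x += sd * rng.gauss(0, 1)
        hi, lo = max(hi, x), min(lo, x)
    return hi - lo, x

rng = random.Random(3)
true_sigma = 0.01   # daily volatility of the simulated asset
days = [simulate_day(true_sigma, 390, rng) for _ in range(3000)]

# Close-to-close estimator: sample variance of daily returns (zero mean).
var_cc = sum(c * c for _, c in days) / len(days)
# Parkinson estimator: scaled mean squared high-low range.
var_p = sum(r * r for r, _ in days) / (4.0 * math.log(2.0) * len(days))

print(math.sqrt(var_cc), math.sqrt(var_p))
```

Both estimates recover the true volatility, but the range-based one does so with far less sampling noise, which is the extra information the models in the paper exploit.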
Technometrics | 2005
German Molina; M. J. Bayarri; James O. Berger
CORSIM, a microsimulator for vehicular traffic, is being studied with respect to its ability to successfully model and predict behavior of traffic in a 36-block section of Chicago. Inputs to the simulator include information about street configuration, driver behavior, traffic light timing, turning probabilities at each intersection, and distributions of traffic ingress into the system. Data are available concerning the turning proportions in the actual neighborhood, as well as counts of vehicular ingress into the neighborhood and internal system counts, during a day in May 2000. Some of the data are accurate (video recordings), but some are quite inaccurate (observer counts of vehicles). Previous use of the full dataset involved “tuning” the parameters of CORSIM—in an ad hoc fashion—until CORSIM output was reasonably close to the actual data. This common approach, of simply tuning a complex computer model to real data, can result in poor parameter choices and completely ignores the often considerable uncertainty remaining in the parameters. To overcome these problems, we adopt a Bayesian approach, together with a measurement error model for the inaccurate data, to derive the posterior distribution of turning probabilities and of the parameters of the CORSIM input distribution. This posterior distribution can then be used to initialize runs of CORSIM, yielding outputs reflecting the actual uncertainty in the analysis. Determining the posterior via Markov chain Monte Carlo (MCMC) methodology is not directly feasible because of the running time of CORSIM. Fortunately, the turning probabilities and parameters of the input distribution enter CORSIM through a probability structure that can be almost exactly described by a stochastic network that does allow an MCMC analysis. The resulting MCMC has some novel features that should also be useful in dealing with general discrete network structures. 
The major conclusion of this study is that it is possible to incorporate uncertainty in model inputs into analyses of traffic microsimulators such as CORSIM, and that incorporating this uncertainty can significantly change the variability of engineering simulations performed with CORSIM. The second engineering conclusion, that traffic counts obtained by human observers can have very significant bias in a positive direction (corresponding to overcounting of vehicles), was unexpected.
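The flavor of such an analysis, with a measurement error model for biased human counts, can be sketched with a toy Metropolis sampler for a single turning probability; the counts and the binomial overcounting model below are hypothetical, not the paper's network formulation:

```python
import math, random

# Accurate video data: 62 turning vehicles out of 200 (hypothetical numbers).
video_turns, video_total = 62, 200
# Inaccurate human counts of the same movement: observers may overcount,
# modeled as Binomial(total, theta * (1 + bias)) with unknown bias.
obs_turns, obs_total = [88, 94, 90], [240, 250, 245]

def log_post(theta, bias):
    if not (0.0 < theta < 1.0) or bias < -0.5 or not (0.0 < theta * (1 + bias) < 1.0):
        return -math.inf
    lp = video_turns * math.log(theta) + (video_total - video_turns) * math.log(1 - theta)
    q = theta * (1 + bias)
    for y, n in zip(obs_turns, obs_total):
        lp += y * math.log(q) + (n - y) * math.log(1 - q)
    lp += -0.5 * (bias / 0.25) ** 2   # N(0, 0.25) prior on the observer bias
    return lp

rng = random.Random(11)
theta, bias, draws = 0.3, 0.0, []
for i in range(20000):
    t_new = theta + rng.gauss(0, 0.02)   # random-walk Metropolis proposals
    b_new = bias + rng.gauss(0, 0.05)
    if math.log(rng.random()) < log_post(t_new, b_new) - log_post(theta, bias):
        theta, bias = t_new, b_new
    if i >= 5000:   # discard burn-in
        draws.append((theta, bias))

print(sum(t for t, _ in draws) / len(draws),   # posterior mean turning probability
      sum(b for _, b in draws) / len(draws))   # posterior mean observer bias
```

With these toy inputs the posterior puts the bias on the positive side, mirroring the study's overcounting finding; the real analysis does this jointly over a whole stochastic network of movements.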
Bayesian Analysis | 2015
Roberto Casarin; Fabrizio Leisen; German Molina; Enrique ter Horst
We build on Fackler and King (1990) and propose a general calibration model for implied risk-neutral densities. Our model allows for the joint calibration of a set of densities at different maturities and dates. The model is a Bayesian dynamic beta Markov random field, which allows for possible time dependence between densities with the same maturity and for dependence across maturities at the same point in time. The assumptions on the prior distribution allow us to balance the needs of model flexibility, parameter parsimony and information pooling across densities.
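The kind of time dependence between (0,1)-valued quantities that such a model encodes can be illustrated with a minimal beta Markov chain, in which each value is beta-distributed around its predecessor; the full model additionally couples chains across maturities, which this sketch omits, and its concentration parameter here is an arbitrary choice:

```python
import random

def beta_chain(n=300, c=50.0, x0=0.5, seed=5):
    # Minimal (0,1)-valued Markov chain:
    #   x_t | x_{t-1} ~ Beta(1 + c*x_{t-1}, 1 + c*(1 - x_{t-1})),
    # so each value is concentrated around its predecessor (larger c means
    # stronger time dependence), with a weak pull toward 0.5 that keeps the
    # chain away from the endpoints.
    rng = random.Random(seed)
    xs, x = [], x0
    for _ in range(n):
        x = rng.betavariate(1.0 + c * x, 1.0 + c * (1.0 - x))
        xs.append(x)
    return xs

xs = beta_chain()
print(xs[:3])
```

The persistence of the chain (high lag-one correlation) is the property that lets information pool across neighboring dates.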
Mathematics and Computers in Simulation | 2014
Chuan-Hsiang Han; German Molina; Jean-Pierre Fouque
In this paper we propose to use Markov chain Monte Carlo (MCMC) methods to estimate the parameters of stochastic volatility models with several factors varying at different time scales. The originality of our approach, in contrast with classical factor models, is the identification of two factors driving univariate series at well-separated time scales. This is tested with simulated data as well as foreign exchange data. Furthermore, we address the calibration problem of the implied volatility surface by proposing a computational scheme that combines MCMC estimation with variance reduction techniques in MC/QMC simulations for option evaluation under multi-scale stochastic volatility models. Empirical studies and extensions are discussed.
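Antithetic variates are one classical variance reduction technique of the kind referred to here. A sketch under plain Black-Scholes dynamics rather than a multi-scale stochastic volatility model, so the comparison stays self-contained:

```python
import math, random

def mc_call_price(s0, k, r, sigma, t, n, rng, antithetic=False):
    # Monte Carlo price of a European call under Black-Scholes dynamics.
    # With antithetic=True each draw z is paired with -z and the payoffs
    # averaged, which cancels much of the sampling noise.
    total, disc = 0.0, math.exp(-r * t)
    drift = (r - 0.5 * sigma * sigma) * t
    vol = sigma * math.sqrt(t)
    for _ in range(n):
        z = rng.gauss(0, 1)
        payoff = max(s0 * math.exp(drift + vol * z) - k, 0.0)
        if antithetic:
            payoff = 0.5 * (payoff + max(s0 * math.exp(drift - vol * z) - k, 0.0))
        total += payoff
    return disc * total / n

rng = random.Random(2)
plain = [mc_call_price(100, 100, 0.01, 0.2, 1.0, 2000, rng) for _ in range(50)]
anti = [mc_call_price(100, 100, 0.01, 0.2, 1.0, 2000, rng, antithetic=True) for _ in range(50)]

def spread(xs):
    # Variance of the repeated price estimates.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(spread(plain), spread(anti))   # antithetic estimates vary less
```

The paper's scheme applies such techniques inside option evaluation under multiscale stochastic volatility, where each simulated path is far more expensive and variance reduction matters correspondingly more.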
Tourism planning and development | 2017
Lina Echeverri; Enrique ter Horst; German Molina; Zarifa Mohamad
Most countries are concerned about the image they project in international markets. They have adopted and implemented differentiation strategies in order to stimulate tourism and economic investment. In the case of Colombia, its reputation has been built on unplanned positioning, the interests and views of a few opinion leaders, political and economic instability, and transformations in the productive sector. This paper outlines, using a Bayesian variable selection approach, the perception that foreign visitors and prospects have of Colombia's country image, and proposes a methodological framework for unveiling the driving factors. Findings of this research demonstrate that countries may be seen positively from the point of view of visitors while prospects may have a negative image of them. The results validate the hypothesis that the symbolic elements associated with a country's image, in this case Colombia's, should be included in the communication activities of a country branding strategy.
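A generic Bayesian variable selection sketch conveys the approach: enumerate candidate predictor subsets, weight them by an approximate marginal likelihood (here a BIC approximation), and report posterior inclusion probabilities. The simulated data and predictor names are hypothetical, not the survey variables of the study:

```python
import math, random
from itertools import combinations

def solve(a, b):
    # Gauss-Jordan elimination for small linear systems a x = b.
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(n):
            if r != c:
                f = m[r][c] / m[c][c]
                m[r] = [x - f * y for x, y in zip(m[r], m[c])]
    return [m[i][n] / m[i][i] for i in range(n)]

def rss(y, cols):
    # Residual sum of squares of OLS on an intercept plus the given columns.
    xs = [[1.0] * len(y)] + list(cols)
    k = len(xs)
    a = [[sum(xs[i][t] * xs[j][t] for t in range(len(y))) for j in range(k)] for i in range(k)]
    b = [sum(xs[i][t] * y[t] for t in range(len(y))) for i in range(k)]
    beta = solve(a, b)
    return sum((y[t] - sum(beta[i] * xs[i][t] for i in range(k))) ** 2 for t in range(len(y)))

rng = random.Random(4)
n = 120
x1 = [rng.gauss(0, 1) for _ in range(n)]
x2 = [rng.gauss(0, 1) for _ in range(n)]
x3 = [rng.gauss(0, 1) for _ in range(n)]
# Only x1 truly drives the simulated response.
y = [1.5 * a + rng.gauss(0, 1) for a in x1]

names, cols = ["x1", "x2", "x3"], [x1, x2, x3]
weights = {}
for size in range(4):
    for sub in combinations(range(3), size):
        # BIC-based approximate model weight exp(-BIC/2).
        bic = n * math.log(rss(y, [cols[i] for i in sub]) / n) + (size + 1) * math.log(n)
        weights[sub] = math.exp(-0.5 * bic)
total = sum(weights.values())
incl = {names[i]: sum(w for s, w in weights.items() if i in s) / total for i in range(3)}
print(incl)   # posterior inclusion probability per predictor
```

The genuine driver receives inclusion probability near one while the noise predictors do not, which is how the study separates the factors shaping perceptions of country image.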
Journal of Theoretical and Applied Electronic Commerce Research | 2017
Silvana Dakduk; Enrique ter Horst; Zuleyma Santalla; German Molina; José Malavé
Online shopping has increasingly replaced traditional retail shopping, as a large number of consumers have adopted it on a global scale. However, while it is well established in developed countries, e-commerce is still at an early stage in emerging markets, hence there is a need to unveil which factors contribute to its adoption. The main objective of this study is to integrate the theory of planned behavior, the theory of reasoned action, and the technology acceptance model using a Bayesian approach to determine the key predictors of online purchase intention among internet users in Colombia. The results demonstrate the pertinence of the theory of reasoned action and the technology acceptance model for explaining online purchase intention, confirming that the intention to purchase online is mostly determined by attitudes toward e-commerce which, in turn, are explained by perceived usefulness, perceived ease of use, and the subjective norm related to online shopping.
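The role of attitude as a predictor can be sketched with a small Bayesian logistic regression fitted by a random-walk Metropolis sampler; the simulated survey data and the N(0, 10) priors are assumptions for illustration, not the study's measurement model:

```python
import math, random

rng = random.Random(9)
n = 400
# Simulated survey: a standardized "attitude toward e-commerce" score and a
# binary purchase-intention response; the positive true effect is assumed.
attitude = [rng.gauss(0, 1) for _ in range(n)]
intent = [1 if rng.random() < 1.0 / (1.0 + math.exp(-(0.5 + 1.2 * a))) else 0
          for a in attitude]

def log_post(b0, b1):
    # Bernoulli-logit likelihood with weak N(0, 10) priors on both coefficients.
    lp = -(b0 * b0 + b1 * b1) / (2.0 * 10.0 ** 2)
    for a, y in zip(attitude, intent):
        eta = b0 + b1 * a
        lp += y * eta - math.log(1.0 + math.exp(eta))
    return lp

b0, b1, kept = 0.0, 0.0, []
cur = log_post(b0, b1)
for i in range(8000):
    c0, c1 = b0 + rng.gauss(0, 0.15), b1 + rng.gauss(0, 0.15)
    cand = log_post(c0, c1)
    if math.log(rng.random()) < cand - cur:
        b0, b1, cur = c0, c1, cand
    if i >= 2000:   # discard burn-in
        kept.append(b1)

print(sum(kept) / len(kept))   # posterior mean of the attitude effect
```

A clearly positive posterior for the attitude coefficient is the kind of evidence behind the study's conclusion that attitudes largely determine purchase intention.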
Journal of The Royal Statistical Society Series A-statistics in Society | 2018
Roberto Casarin; German Molina; Enrique ter Horst
In this paper we expand the literature on risk-neutral density estimation across maturities from implied volatility curves, usually estimated and interpolated through cubic smoothing splines. The risk-neutral densities are computed through the second derivative as in Panigirtzoglou and Skiadopoulos (2004), which we extend through a Bayesian approach to the problem, featuring: (1) an extension to a multivariate setting across maturities and over time; (2) a flexible estimation approach for the smoothing parameter, traditionally assumed common to all assets, known and fixed across maturities and time, but now potentially different between assets and maturities, and over time; and (3) information borrowing about the implied curves and risk-neutral densities not only across different option maturities, but also dynamically.
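The second-derivative step is the Breeden-Litzenberger relation: the risk-neutral density is the discounted second derivative of the call price with respect to strike. A sketch using a Black-Scholes call curve in place of a fitted spline, with illustrative parameter values:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, r, sigma, t):
    # Black-Scholes call price, used here to generate a smooth call curve
    # in place of a curve recovered from a fitted implied volatility spline.
    d1 = (math.log(s / k) + (r + 0.5 * sigma * sigma) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

s, r, sigma, t = 100.0, 0.01, 0.2, 0.5
dk = 0.1
strikes = [40.0 + i * dk for i in range(1201)]   # strike grid 40..160
calls = [bs_call(s, k, r, sigma, t) for k in strikes]

# Breeden-Litzenberger: the risk-neutral density is exp(r*t) times the second
# derivative of the call price in strike, here by central finite differences.
density = [math.exp(r * t) * (calls[i - 1] - 2 * calls[i] + calls[i + 1]) / dk ** 2
           for i in range(1, len(calls) - 1)]

mass = sum(density) * dk
print(mass)   # total probability, close to 1
```

In the paper this differentiation is applied to spline-smoothed implied curves, with the smoothing parameter itself estimated in the Bayesian model rather than fixed.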