Discussion of "Geodesic Monte Carlo on Embedded Manifolds"
Simon Byrne, Mark Girolami, Persi Diaconis, Christof Seiler, Susan Holmes, Ian L. Dryden, John T. Kent, Marcelo Pereyra, Babak Shahbaba, Shiwei Lan, Jeffrey Streets, Daniel Simpson
Comment: Connections and Extensions
Persi Diaconis, Christof Seiler and Susan Holmes

Historical Context
We welcome this paper of Byrne and Girolami [BG]; it breathes even more life into the emerging area of hybrid Monte Carlo Markov chains by introducing original tools for dealing with Monte Carlo simulations on constrained spaces such as manifolds. We begin our comment with a bit of history. Using geodesics to sample from the uniform distribution on a Stiefel manifold was proposed by Asimov (1985) in his work on the Grand Tour for exploratory data analysis. For data $x_1, x_2, \ldots, x_n$ in $\mathbb{R}^p$, it is natural to inspect low-dimensional projections $\gamma x_1, \gamma x_2, \ldots, \gamma x_n$ for $\gamma : \mathbb{R}^p \to \mathbb{R}^k$. In the [BG] paper the authors have a space of $k$-frames in $\mathbb{R}^p$, called $V_{k,p}$. If one chooses $\gamma$ at random from this space, the views would be too 'disconnected' or 'jerky' for human observers. A better tactic turned out to be to choose a few $\gamma_i$, $1 \le i \le L$, at random and then move smoothly from $\gamma_i$ to $\gamma_j$ along the available closed-form geodesics. While in a historical mode, we point to the little known papers of McLachlan and Quispel (2003) and more recent papers by Betancourt on hybrid Monte Carlo (Betancourt, 2013).

Discrete Hamiltonian Dynamics
The paper of [BG] uses Hamiltonian dynamics to move around on a manifold in an intelligent way to get proposals for the Metropolis algorithm. There are also many problems where samples are needed for constrained discrete spaces. These include sampling contingency tables with given row and column sums as in Diaconis and Sturmfels (1998). We recently encountered the following problem in a quantum physics context (Chatterjee and Diaconis, 2013). Consider boxes labeled $1, 2, 3, \ldots$. Drop $N$ balls into these boxes according to a Bose-Einstein allocation, resulting in $N_i$ balls in the box labeled $i$. Interest is in samples conditional on $\sum_i i N_i = E$. This is a discrete version of the authors' sampling from simplices and spheres. We do not currently have discrete versions of Hamiltonian dynamics apart from numerical schemes (leapfrog) that are used to solve the resulting differential equations, as proposed by Neal (2011). In contrast, [BG] compute the dynamics by splitting the Hamiltonian into two analytically solvable parts. We wonder whether the authors can suggest adaptations of their ideas to the discrete framework.

Statistics Department, Stanford University, CA 94305
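In the absence of a discrete Hamiltonian scheme, a plain Metropolis chain already illustrates the structure of the problem. The following sketch is our own illustration, not a method from [BG]: the box count K, the starting configuration and the move type are arbitrary choices, and irreducibility over all valid configurations is not verified here.

```python
import random

random.seed(1)

# Symmetric-proposal Metropolis chain on box occupations n[1..K] with both
# invariants fixed: sum_i n_i = N (ball count) and sum_i i*n_i = E (energy).
K, N, E = 10, 6, 24
n = {i: 0 for i in range(1, K + 1)}
n[4] = N  # a valid start: 6 balls in box 4 gives E = 24

def step(n):
    # Take one ball from box i and one from box j, redeposit them in
    # boxes i-1 and j+1.  This preserves N and E exactly; the conditional
    # Bose-Einstein law is uniform over valid configurations, and the
    # proposal is symmetric, so every in-range move is accepted.
    i = random.randint(1, K)
    j = random.randint(1, K)
    enough = n[i] > 0 and n[j] - (1 if i == j else 0) > 0
    if enough and i - 1 >= 1 and j + 1 <= K:
        n[i] -= 1
        n[j] -= 1
        n[i - 1] += 1
        n[j + 1] += 1

for _ in range(10_000):
    step(n)

assert sum(n.values()) == N
assert sum(i * c for i, c in n.items()) == E
```

The reverse of a move at $(i, j)$ is the move at $(j+1, i-1)$, chosen with the same probability, so the proposal is symmetric and the chain leaves the uniform conditional law invariant.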
Non-Smooth Manifolds

[BG] start with the Hausdorff measure from geometric measure theory (Diaconis, Holmes, and Shahshahani, 2012; Federer, 1969; Morgan, 2009) as a general way to define surface areas for non-smooth manifolds in arbitrary dimensions. We wonder if this is a bit misleading, since all subsequent developments and examples in the paper focus on homogeneous smooth manifolds.

One example for which the methodology presented runs into difficulties is the barbell (Grayson, 1989), parametrized as
$$B(x, \theta) = \begin{pmatrix} x \\ f(x)\cos\theta \\ f(x)\sin\theta \end{pmatrix}, \qquad 0 \le \theta < 2\pi,$$
with changing radius
$$f(x) = \begin{cases} r \cosh\!\left(\dfrac{|x| - l}{r}\right) & \text{if } |x| > l, \\ r & \text{otherwise.} \end{cases}$$
The difficulties arise at the corner of the transition from the bar to the bell section in the first coordinate of $B$ at position $|x| = l$: the derivative at these points is not defined. In contrast, the geometric measure theory approach handles such difficulties by realizing that sets of area 0 do not influence the integral over a manifold. The intuition is that the line dividing the bar and the bell is negligible for computing two-dimensional integrals. Following this approach as described in Diaconis, Holmes, and Shahshahani (2012), we sample $x$ from the unnormalized surface measure
$$\sqrt{\det\!\left([D B(x,\theta)]^{T} [D B(x,\theta)]\right)} = \begin{cases} r \cosh^2\!\left(\dfrac{|x| - l}{r}\right) & \text{if } |x| > l, \\ r & \text{otherwise.} \end{cases}$$
The R code snippet (Code 1) generates samples of the parameter $x$ using rejection sampling.

Code 1
Rejection sampling yielding x.

n = 5e3; r = 1; l = 2; L = 4
xprop = runif(n, min = -L, max = L)
eta = runif(n, min = 0, max = r * cosh((L - l)/r)^2)
x = c()
for (i in 1:length(xprop)) {
  if (abs(xprop[i]) > l) {
    if (eta[i] < r * cosh((abs(xprop[i]) - l)/r)^2) {
      x = c(x, xprop[i])
    }
  } else {
    if (eta[i] < r) {
      x = c(x, xprop[i])
    }
  }
}

Figure 1: The barbell is an example of a non-smooth manifold.

From these samples, and θ drawn uniformly between 0 and 2π, we can plot the barbell with points uniformly distributed with respect to its surface area (Figure 1). If we instead sampled the parameter x uniformly, we would obtain a higher point density on the bar than on the bell section, due to the higher curvature.

We are curious to know why [BG] decided to include the geometric measure theory part in the introduction rather than simply focusing on Riemannian manifolds and the Riemannian volume form.

Consistency of Bayes Estimates on Manifolds
Two different philosophical viewpoints are crucial to the study of the consistency of Bayes estimates, namely the "classical" and the "subjectivistic". The classical viewpoint studies the consistency of Bayes estimates assuming the existence of a fixed underlying parameter. In this context, we consider the posterior Bayes estimate to be consistent w.r.t. a prior if it converges to the underlying parameter as the number of imaginary observations tends to infinity.

On the other hand, the subjectivistic viewpoint rejects the notion of a fixed underlying parameter. In this context, we rather evaluate whether two different priors created by two different imaginary statisticians lead to the same posterior estimate as the number of imaginary observations tends to infinity. We can analyze the derivative of the map that sends the prior to the posterior measure. This helps to evaluate how the posterior reacts to small changes in the prior. In this fashion, we can study an infinite number of imaginary statisticians and how their beliefs affect the outcome of a Bayesian analysis.

We introduced these concepts in Diaconis and Freedman (1986) for Euclidean spaces, and we are interested in how these results translate to the case of smooth and non-smooth manifolds. Some initial work towards addressing these questions can be found in Bhattacharya and Dunson (2012).
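A toy computation illustrates this merging of subjective opinions. The construction below is ours, with a hypothetical Bernoulli experiment and conjugate Beta priors; it is not an example from Diaconis and Freedman (1986).

```python
# Two "imaginary statisticians" with different Beta priors observe the same
# Bernoulli data; as the sample size grows their posterior means agree.
# The priors and the 70%-heads data stream are made up for illustration.
def posterior_mean(a, b, heads, n):
    """Posterior mean of a Beta(a, b) prior after `heads` successes in n trials."""
    return (a + heads) / (a + b + n)

gap_small_n = abs(posterior_mean(1, 1, 7, 10) - posterior_mean(20, 5, 7, 10))
gap_large_n = abs(posterior_mean(1, 1, 7000, 10_000) -
                  posterior_mean(20, 5, 7000, 10_000))

assert gap_large_n < gap_small_n   # disagreement shrinks with more data
assert gap_large_n < 0.01          # and eventually becomes negligible
```

The gap between the two posterior means shrinks at rate $O(1/n)$, the simplest Euclidean instance of the kind of prior-to-posterior stability discussed above.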
Manifold and Metric Learning
In the absence of a manifold parametrization, we might want to estimate it from data. Recent advances by Perrault-Joncas and Meilă (2013) on unifying manifold learning methods into a consistent framework, by learning the Riemannian metric in addition to the manifold and its embedding, are promising, but they build upon the assumption of a uniform sampling density on the manifold (Belkin and Niyogi, 2007; von Luxburg, Belkin, and Bousquet, 2008). But what if the sampling of the data is not related to the geometry of the manifold? In this case, we want to find the manifold that is consistent for a family of distributions on a given set of data points. From a Bayesian perspective, we could study non-uniform density distributions on the manifold through the derivative of the map from prior to posterior measure, analogous to the consistency evaluations of Bayes estimates above.
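As the simplest possible sketch of recovering geometry from samples (our illustration, not the Perrault-Joncas and Meilă procedure), local principal component analysis estimates the tangent direction of a noisy circle from nearest neighbours; the sample size, noise level and neighbourhood size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# 500 noisy samples from the unit circle in R^2.
t = rng.uniform(0, 2 * np.pi, 500)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.standard_normal((500, 2))

# Estimate the tangent space at the point (1, 0) by local PCA:
# SVD of the centred 30 nearest neighbours.
x0 = np.array([1.0, 0.0])
d = np.linalg.norm(X - x0, axis=1)
nbrs = X[np.argsort(d)[:30]]
_, _, Vt = np.linalg.svd(nbrs - nbrs.mean(0))
tangent = Vt[0]  # leading local direction

# Near (1, 0) the circle's tangent is vertical, i.e. close to (0, +-1).
assert abs(tangent[1]) > 0.95
```

A non-uniform sampling density would bias such neighbourhood-based estimates, which is exactly the concern raised above.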
Applications in Computational Anatomy
Among many potential fields of application, we would like to highlight computational anatomy (Marsland et al., 2012; Miller, 2004; Younes, 2010). The main goal of computational anatomy is to compare shapes of organs (e.g. brain, heart and spine) observed in computed tomography (CT) and magnetic resonance imaging (MRI). Statistical analysis of shape differences can be useful for understanding disease-related changes in anatomical structures. The key idea is to estimate transformations between a template and patient anatomies. These transformations encode the structural differences in a population of patients. A wide range of groups of transformations has been studied, ranging from rigid rotations to infinite-dimensional groups of diffeomorphisms. What these groups have in common is that their elements do not live in Euclidean space but on more general manifolds. Currently, most transformation estimators are based on the optimization of a cost function. In the future, we envision Bayesian approaches along the lines of Seiler, Pennec, and Holmes (2013), with the help of the methodologies proposed in this paper.
Future Directions
The paper suggests new research questions: how long should the new algorithms be run to ensure that the resulting distributions are usefully close to their stationary distribution? We haven't seen any careful analysis of hybrid Monte Carlo in continuous problems (we mean quantitative, non-asymptotic bounds as in Jones and Hobert (2001)). A first effort was made in a toy problem in Diaconis, Holmes, and Neal (2000).

The authors work with 'nice manifolds'; often, manifolds are only given implicitly, with local coordinate patches. Our work (Diaconis, Holmes, and Shahshahani, 2012) did not deal with this problem, and we would love to have help from the authors to make progress in these types of applications.
Comment
Ian L. Dryden

School of Mathematical Sciences, University of Nottingham

The authors have introduced an interesting and mathematically intricate method for Markov chain Monte Carlo simulation on an embedded manifold. The geodesic Monte Carlo (MC) method provides large proposals as part of the scheme, which are devised by careful study of the Riemannian geometry of the space and of the geodesics in particular. The aim of the resulting algorithm is to produce a chain with low autocorrelation and high acceptance probabilities. As displayed by the authors, the method is well geared up for simulating from unimodal distributions on a manifold via the gradient of the log-density and the geodesic flow. They also demonstrate its effective use in multimodal scenarios via parallel tempering. Given that there are always many choices of embedding, should one choose as low-dimensional an embedding as possible?

There are various levels of approximation in the algorithm, and so it is worth exploring in any specific application whether simpler algorithms can end up providing more efficient or more accurate simulations. Consider the Fisher-Bingham example, and recall that the Fisher-Bingham$(c, A)$ distribution can be defined as the distribution of $\{X \mid \|X\| = 1\}$ where $X \sim N_p(\mu, \Sigma)$, with
$$\mu = -\tfrac{1}{2}(A + a I_p)^{-1} c, \qquad \Sigma = -\tfrac{1}{2}(A + a I_p)^{-1},$$
where $a$ is chosen such that $(A + a I_p)$ is negative definite (see Mardia and Jupp, 2000, p. 175) and $I_p$ is the $p \times p$ identity matrix. Since the Fisher-Bingham density is unchanged by adding $a I_p$ to $A$, we can, for example, choose $a$ such that $\mathrm{trace}(\Sigma) = 1$. The integrating constant of the Fisher-Bingham distribution can be expressed in terms of the density of a linear combination of noncentral $\chi^2$ random variables (Kume and Wood, 2005), which can be evaluated using a saddlepoint approximation. Hence simulation via rejection methods is feasible.

An even simpler approach when $c$ is small could be to simulate from $Y \sim N_p(\mu, \Sigma)$, and then keep only the observations that fall within $|\|Y\| - 1| < \nu$, for small $\nu > 0$. This naive conditioning method might appear rather inefficient, but the accepted observations are independent draws. Note that if the dimension $p$ is large and $X$ is Bingham distributed with $\mathrm{trace}(\Sigma) = 1$, $\mathrm{trace}(\Sigma^2) \approx 0$ and $c = 0$, then from Dryden (2005) we have the approximation $X \approx N_p(0, \Sigma)$. Hence, even for large $p$ this can still be a practical method for certain $\Sigma$. In Figure 2 we show the results of this algorithm in the example from Section 5.1 of the paper, with $c = 0$, 2 billion proposals and a small $\nu$.

Figure 2: Simulated values of $x_1$ for the Fisher-Bingham example with $c = 0$. There are 6588 simulated values from 2 billion proposals.

There is always a trade-off with any simulation method, and one needs to compromise between the level of approximation (through $\nu$ here), the efficiency in run time, the independence of observations and the amount of coding involved in the implementation. For this Bingham example the naive conditional method seems reasonable, giving independent, near-exact realisations with very minimal coding effort. However, the beauty of the geodesic MC method of the paper is that the algorithm is quite general, and so it can be tried out in a range of scenarios where there may be no reasonable alternative.
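The naive conditioning recipe can be written in a few lines. The sketch below uses assumed values (c = 0, a made-up diagonal Σ with trace one, and an arbitrary tolerance ν); it is not the exact setup of Section 5.1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Naive conditioning: draw Y ~ N_p(0, Sigma) and keep the draws with
# | ||Y|| - 1 | < nu.  Accepted draws are independent and lie (almost)
# exactly on the unit sphere.  Sigma and nu are illustrative choices.
p = 4
variances = np.array([0.7, 0.1, 0.1, 0.1])   # diagonal Sigma, trace = 1
nu = 0.01

Y = rng.standard_normal((200_000, p)) * np.sqrt(variances)
norms = np.linalg.norm(Y, axis=1)
accepted = Y[np.abs(norms - 1.0) < nu]

assert len(accepted) > 0
assert np.all(np.abs(np.linalg.norm(accepted, axis=1) - 1.0) < nu)
```

The inefficiency is visible directly: only a small fraction of the 200,000 proposals survive, but each survivor is an independent, near-exact draw.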
Comment

John T. Kent

Department of Statistics, University of Leeds, Leeds LS2 9JT, UK

Statistical distributions on manifolds have become an increasingly important component of geometrically motivated, sophisticated high-dimensional statistical models in recent years. For example, Green and Mardia (2006) used the matrix Fisher distribution for random 3 × 3 rotations, with an application to a problem of protein alignment in bioinformatics. MCMC simulations often form the standard methodology for fitting such high-dimensional models. Hence there is a growing interest in developing efficient and general methods for simulating distributions on manifolds in their own right. The paper makes a very valuable contribution in this area.

However, although MCMC is a very general and very powerful methodology, it is inherently potentially slow and cumbersome to use in practice, due to the formal need to run a Markov chain to convergence. Hence when quicker alternatives (such as acceptance rejection algorithms) are available, it is important to be aware of them. There is a well-known two-to-one correspondence between $S^3$, the unit sphere in 4 dimensions, and $SO(3)$, and it also follows that the matrix Fisher distribution on $SO(3)$ can be identified with the Bingham distribution on $S^3$. Hence the new method for the Bingham distribution can be used directly for the matrix Fisher in this setting. It can be shown that the efficiency of this new acceptance rejection simulation method is very respectable; it is bounded below by 45% for all values of the parameters. More details can be found in Kent, Ganeiber, and Mardia (2013).

It must be conceded that this new acceptance rejection methodology is not a panacea. In particular, for product manifolds there is often currently no alternative to MCMC. But for the simpler cases, the acceptance rejection methods can be very effective.

Comment
Marcelo Pereyra

Department of Mathematics, University of Bristol

I congratulate the authors on an interesting paper and an important methodological contribution to the problem of sampling from probability distributions on manifolds. As an image processing researcher, I shall restrict my comments to the potential of the proposed methodology for statistical signal and image processing. There are numerous new and exciting signal and image processing applications that require performing statistical inference on parameter spaces constrained to submanifolds of $\mathbb{R}^n$ and for which the proposed HMC algorithm is potentially interesting. For example, there are many unmixing or source separation problems that require estimating parameters that, because of physical considerations, are subject to positivity and sum-to-one constraints (i.e. constrained to a simplex) (Golbabaee, Arberet, and Vandergheynst, 2012); for instance, the estimation of abundances (or proportions) of different materials and substances within the pixels of a satellite hyperspectral image (Bioucas-Dias et al., 2012). These images are increasingly used in environmental science to monitor the evolution of vegetation in rainforests, and in agriculture to forecast crop yield. Similar spectral imaging technologies are now used in materials science and chemical analysis (Dobigeon and Brun, 2012). Moreover, another important example of signal processing on manifolds is dictionary learning for sparse signal representation and compressed sensing, which involves estimating a set of orthonormal vectors constrained to a Stiefel manifold (Dobigeon and
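For the simplex constraints mentioned above, a common alternative to working on the manifold directly is to reparametrise. A minimal sketch (our illustration, not the approach of the paper) maps unconstrained coordinates to abundances by a softmax:

```python
import numpy as np

# Positivity and sum-to-one constraints handled by reparametrisation:
# a softmax map from unconstrained R^{K-1} to the interior of the
# K-simplex, with a reference coordinate fixed at 0 for identifiability.
def to_simplex(z):
    """Map z in R^{K-1} to a point in the open K-simplex."""
    z = np.concatenate([z, [0.0]])
    e = np.exp(z - z.max())      # subtract max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
theta = to_simplex(rng.standard_normal(3))   # K = 4 abundances
assert np.all(theta > 0) and np.isclose(theta.sum(), 1.0)
```

An unconstrained sampler can then operate on $z$, at the price of a Jacobian correction and possibly distorted geometry, which is precisely where a sampler that works on the simplex itself, such as the proposed geodesic HMC, becomes attractive.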
Comment
Babak Shahbaba, Shiwei Lan and Jeffrey Streets

We would like to start by congratulating Byrne and Girolami for writing such a thoughtful and extremely interesting paper. This is in fact a worthy addition to other high-impact papers recently published by Professor Girolami's lab in this field. The common theme of these papers is the use of geometrically motivated methods to improve the efficiency of sampling algorithms. In their seminal paper, Girolami and Calderhead (2011) propose a novel HMC method, called Riemannian Manifold Hamiltonian Monte Carlo (RMHMC), that adapts to the local geometry of the parameter space. While this is a natural and beautiful idea, significant computational difficulties arise in effectively implementing this algorithm. In contrast, in this current contribution, Byrne and

Department of Statistics and Department of Computer Science, University of California, Irvine, USA. Department of Statistics, University of California, Irvine, USA. Department of Mathematics, University of California, Irvine, USA.
l ll ll llll ll ll ll ll ll ll llll llll llll l l llll ll llll ll ll ll ll ll ll llll ll llll ll llll ll ll ll ll llll ll ll llll llll ll ll ll llll ll lllll lll llll ll ll ll ll ll llll ll ll llll ll llll ll llll ll ll ll ll ll lllll lll ll llll ll llll llll ll ll ll ll lllll lll lllll A B llllllllll llllllllll ll llll ll lll lllll llll ll l llll lll llll ll ll lll llll ll lll ll lll ll ll llll llll ll llll ll ll llll lll llll l lll lll ll lll llll ll llllll lll lll lll l lll l lll lll lllll l ll ll l l ll ll ll ll l ll l llll ll l ll ll ll ll ll ll l lll llll ll lll lll ll ll ll ll lll ll ll ll ll llll lll lll ll llll lll l ll l llll lll lll ll ll llll l lll ll llll l ll l lll llllll lll l ll ll llll ll ll lllll l l ll l ll lll ll ll ll lll l ll ll ll ll l l ll lll ll lllll l ll lll l llll lll l lll lll l ll ll llll lll ll l lll l llll l l l lll lll lll l l ll llll l lll ll lllll ll ll lll l ll ll ll ll lll l lll ll ll llll lll lll llll ll l llll ll l lll l lll ll ll lll llll ll l lllll l ll lll llll l ll ll ll lll l l ll ll l llll l lll llll lll l ll ll ll lll ll ll ll ll l lll ll ll lll lll lll lll l ll lll ll ll ll ll lllll llll l lllll ll ll ll l ll ll lll ll l l lll ll lll lll ll ll ll l lll l ll ll llll l ll ll llll l lll lll lll l ll l ll ll l lll l lll l ll lll lll ll l ll lll lll ll l lll ll ll llll l lll l lll ll lll lll l lll ll ll l lll llll l llll lll ll ll ll lll l ll lllll l ll ll ll l ll lll ll l ll ll l lll lll lll ll llll l lll ll ll ll ll lll lll ll l lll l ll lll lll lllll l ll lll ll l ll l lllll l l l l lllll ll ll ll llll ll ll lll l ll llll lll ll lll lll l ll l ll l llll ll ll ll ll lll ll ll ll ll l lll ll ll l lllll l ll ll ll ll ll lll ll l lllll lll ll ll ll ll ll lll llll l l l l ll ll llll l lll ll lll ll ll lll llll ll l lll lll l llll ll l lll ll lll lll l l llll l ll lll ll l llll l lll ll l lll l ll lll l l ll ll ll lll l lll ll l ll llll ll ll lll lll lll l l llll l ll l ll ll ll ll lll llll l llll ll llll l ll llll l ll llll 
ll ll ll lll l ll ll l ll ll ll lll ll ll lllll ll lll ll ll ll ll ll l lll lll l ll lll lll ll ll ll l lll llll ll lll l ll l lll lll ll l ll lllll lll ll ll ll ll l ll l llll l ll lll llll ll ll llll l l l ll l ll ll l ll l ll lll ll l ll ll lll l llll ll l l ll llll lll l ll ll lll l lll ll ll l l ll ll ll ll l ll lll lll l lll ll lll l lll l lll l ll ll lll l lll l lll ll l ll ll ll llll ll lll l l ll ll l lll l lll ll l ll ll ll ll ll l lll ll lll l ll lll l ll ll ll lll l lll l ll l lll ll ll lll l lll ll ll ll llll ll lll lll l lll l ll ll l lll l ll ll lll ll l llll l ll lll lll l l lll l ll ll ll ll l ll llll lll ll lll ll ll ll l ll lll ll lll ll l ll ll lll l lll ll l ll ll l ll lllll lll ll l lll ll l lll l l l lll ll ll l lll lll ll l ll ll l lll lll l lll ll ll ll lll lll ll ll ll l ll l llll l ll lllll ll lll lll ll ll lll l ll ll l ll ll l ll l ll ll lllll lll l llll lll lll lll ll ll ll lll lllll l l llllll ll ll lll llll lll l ll lll ll ll ll ll l lll lll ll ll ll lll ll l lll l lll ll l l ll l lll lll ll lll ll ll ll l ll ll lll l lll l lll lll ll lll l lll l ll l ll l ll llll l ll ll lll lll ll l ll ll llll ll lll lll lll l l lll ll llll l l llll ll ll ll lll ll l lll lll ll ll ll lll l llll llll l ll lll l ll lll l ll lll l l ll ll l ll lll llll lll l lllll l ll ll lll l lll llll ll ll lll l lll llll ll ll l ll ll l l ll l lll lll l ll ll ll l lllll l lll lll l l lll llll l ll ll ll l lll lll l llll lll ll ll llll l lll l ll ll ll l l ll lll lll ll ll lll ll lllll llll l ll ll ll lll l llll l ll ll lllll ll ll lll l lll llll ll ll ll ll l lll ll ll ll l ll lll l ll ll llll ll lll lll l ll lll lll ll ll lll lll ll ll l llll lll lllll ll ll l lll ll ll ll l lll ll l lll lll llll ll ll lll l ll ll lll l llll lll l ll l ll ll l ll ll ll l lll ll llll l l ll lll ll llll ll l lll l lll ll lll lll ll ll l llll ll ll l llll ll llll ll ll ll l l ll lll l ll lll l ll ll ll ll llll ll ll llll ll ll ll ll lll ll l ll llll lll l ll ll l ll ll l lll l l l ll 
ll ll l ll lll lllll ll ll l llll ll lll lll l ll lll l ll lll lll ll lll ll llll l ll lll lll ll ll ll l ll ll ll l l ll lll l l lll ll l ll lll l lllll l ll l ll l ll ll l lll ll ll l ll ll l ll ll ll ll ll ll l ll lll ll l llll lll l l ll lll ll ll l ll lll ll ll ll ll l ll llll l lll l l ll ll llll l l llll lll l ll ll lll l ll ll l ll lll lll ll ll lll l l l llll ll ll l ll llll l llll l lll ll ll l llll l ll l lll l l l ll ll lll ll l ll lll l lll l llll l lll ll ll ll ll l lll lll lllll l lll ll ll l lll lll ll l lll l ll ll llll ll ll l ll ll l ll ll l l lllll ll l lll llll ll ll l l ll ll l lll l l ll ll l lll llll l lll l ll ll ll ll l ll lll l l ll ll ll ll l ll l ll ll ll l lll l ll l l lll lll lll lll ll l llll llll lll lll l lll ll lll ll l ll ll ll ll lll ll lll l ll ll ll l l l llll ll l ll lll ll l ll l l ll ll l l llll ll ll ll ll ll lll l lll lll ll ll ll ll ll l ll lll l lll l lll l lllll l ll l llll l l ll ll ll ll l lll lll ll ll l ll lll ll lll ll lll l lll ll l ll ll l ll lll l ll l ll ll l ll l lll ll ll l lll l ll ll ll ll lll lll lll l ll llll l lll l ll l lll lll l ll ll ll l l ll l l lll l l ll l l ll ll ll ll ll lll ll ll l ll l l lll ll l ll lll l ll l ll lll ll l ll lll ll lll l ll l lll llll l l lll ll ll lll ll lll ll l l lll lll ll lll lll l ll l ll l lll l lll l lll l ll ll llll lll llll ll l lll l ll l l l lll l ll l ll ll ll ll l l llll l l ll l l ll l lll l lll lll ll ll ll ll ll lll l l l ll l l lll ll l l ll l lll lll lll llll lll lll l ll lll ll lll l ll lll l ll lll ll ll ll ll ll ll ll l ll ll l l l ll lll l lll lll lll llll l ll lll lll ll lll lll ll l ll l lll ll llll llll ll ll ll ll lll llll l lll l l lll ll ll l llll l ll ll ll l l lllll ll ll l lll ll llll l lll llll llll ll l ll ll l lll ll ll ll lll ll l lll lll ll lll ll ll l ll l llll ll ll l l ll ll ll lll l l lll lll l ll ll lllll llll l lll ll llll l l l lll l lllll ll llll ll ll lll l l lll l lll l ll ll l l lllll ll l ll lll llll l ll l ll ll l ll l lll ll 
lll lll lll l llll lll ll ll llll ll ll lll ll ll l l l ll ll l l ll ll ll l ll lll lll lll ll ll lll ll lll lll l lll l l llll l ll llll lll ll l l ll lllll llll llll lll llll ll ll ll ll llll ll ll lll ll ll ll l ll l ll ll ll lll ll lll l ll lll l ll l llll ll lll ll lll ll ll l lll l ll ll ll ll l ll l llll l ll ll l ll lll ll ll l ll ll llll ll lll lll l lll lll ll llll lll lll l ll l ll ll lll ll l ll l llll lll l llll ll lll l llll ll lll ll l lll l l lllll l ll ll l lll l ll lll ll l lll l lll ll ll lll l l lll lll lll ll lll l ll ll llll l lll ll l lll lll l l ll ll ll ll ll l l llll lll l lll l lll ll ll lll ll ll l ll l ll l ll ll l ll l llll lllll ll l ll llll l lll ll llll ll ll ll ll lll l llll lll ll l ll ll ll ll ll lll ll ll ll ll lll llll ll l ll llll ll l ll l lll lll lll l ll lll ll ll lll ll ll ll lll l ll ll lll ll ll l l ll lll ll l lll lll ll l llll l ll ll ll ll llll ll ll llll lll l l ll lll ll ll lllll ll l l ll lll ll ll ll ll l ll ll lll ll lll ll ll ll ll ll ll lll lll l l l ll lll l l l lll lll lll ll ll lll ll l lll l ll llll l l ll l lll l ll lll l lll l ll l lll l ll lll ll l ll ll ll ll lll ll lll ll l ll lll lll l ll ll ll l ll ll lll ll lll l ll ll lll ll llll ll ll l ll ll lll l l ll ll ll llll l ll lll ll l l ll ll lll l lll ll l lll l ll ll l ll l l llll l ll l l lll lll lll l l lll ll lllll ll lllll l l ll ll lll ll llll lll lll l ll l ll l l lll l ll ll ll l ll lll ll ll ll ll ll ll ll lll ll lll ll llll lll lll lll lll l l lll l l lll l llll l ll ll ll ll llll lll lll lll ll llll l llllll l lll llll l l lllll lll lllll lll ll lllll ll ll ll l ll l llll ll l lll ll lllll l lll ll ll l ll lll l ll l ll ll l ll ll l l ll llll l l ll l llll l ll ll ll ll ll ll ll ll llll l l llll l llll l l ll lll ll llll lll l lll ll ll lllll llll ll lll ll ll lll ll ll ll lll ll l ll ll ll lll llll lll ll l ll l llll l ll l llll ll ll ll ll l lll l lll ll l ll ll llll l ll l llll l llll l ll ll ll l ll ll ll llll ll llll l ll lll ll l ll lll 
ll ll lll lll ll l ll ll l ll lll lll lll lll ll llll ll ll ll lll ll ll lll ll lll llll ll ll llllll l ll lll lll lll ll ll ll l llll l lll lll llll l ll l ll llll l lll l lll l ll l llll ll lll lll l ll ll l l ll ll ll lll l ll llll lll ll ll ll l llll ll llllll ll l ll l ll lllll ll l l lllll l ll ll ll lll lll ll ll l ll llll l ll ll ll l lllll ll ll ll ll ll l lll l l ll l ll ll l llll ll ll ll l l ll ll lll ll ll ll lll l llll ll ll l llll l ll lll ll lll l ll lll lll llll l ll ll ll ll lll llll l ll l llll l ll ll llll ll ll ll lll l lll ll lll lll ll lll ll l llll llll l ll ll llll lll lll lll ll l ll l ll llll l lll lll ll ll lll l lll ll ll ll ll lll ll lll l lll l lll lll l l lll l lll ll l ll l ll ll l lll ll ll ll llll ll lll l ll l l ll ll l ll ll lll l lll ll lll llllll llll ll llll lll l lll l ll ll llll ll l lll l ll l ll l lll lll ll ll l llll l lll l l ll lll l lll l lll ll lll l l ll ll llll ll ll ll ll lll l lll ll ll ll ll lll l ll ll l l ll ll ll llll ll l ll lll lll l ll ll ll l lll ll ll lll ll l ll lll lll ll lll lll ll l ll ll ll ll lll l lll ll lll lll ll l lll lll l ll l lll l l ll ll lll ll l lll ll lll ll l ll l lll ll l lll lll llll l l l ll ll ll l ll lll ll l ll l lll l ll ll l llll ll ll lll ll l lll l ll ll l lll lll ll lllll ll l llll l ll l l lll lll lll l l l lll ll l ll l l ll ll l ll ll llll lll l lll ll ll l ll lll ll llll l ll lllll l ll lll ll l l lll ll ll l ll ll lll l ll l llll lll l ll ll llll ll lll lll lll lll l lll l lll l l ll ll ll ll llll l lll ll l llll lll ll ll ll lll ll l ll lll lll ll ll lll ll ll l ll ll l lll ll l ll lll l ll l ll lll ll llll ll ll ll l ll lll lll ll l llll ll ll l ll l ll ll l lll lll ll ll ll l ll ll ll ll llll lll ll lll ll lll l lll l llllll lll llll lll llll l ll lll ll lllll l lll lll l lll ll ll ll lll ll lll ll ll ll lll lll lll l lll ll ll ll l lllllll l l l l lll lll lll lll l ll l llll lll ll lll ll ll ll ll lll l l lll l ll l llll l ll ll l lll l ll lll lll lll lll lll ll ll 
llll lll ll lll ll ll ll lll l llll llll lll ll ll ll lll ll ll lll ll llll ll lll ll l ll llll ll lll lll l ll llll l lll ll l lll l ll llll lll ll ll lll ll l ll ll lll l ll ll lll l ll ll lll ll ll lll ll lll l l ll lll ll ll llll ll ll ll ll ll ll l ll llll l ll ll ll lll l llll lll l lll llllllll ll lllll llll ll ll l llll lll llll ll ll ll lll llll ll llll lllll ll llll lll ll lll ll lll l llll lll ll ll lll lll lll lll llll l lll llll l lll llll lll ll lll lll lll lll ll lll llll llll ll lll lll lll lllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllllll lllllllllll llll ll llll llll ll l lll ll ll l lll ll l lllll lll l lll lll ll lll lll ll llllll ll l lll llll lll llll lll l ll lll llll l llll llll l ll llll llll l l lll ll llll llll lllllll lll l lll ll lllll llll llllll ll l lll l ll l llll l ll ll ll lll l l ll lll ll ll llll ll lll l ll ll ll l ll llll l ll lll ll ll ll l ll ll lll l ll ll lll l ll ll lll lll ll ll ll l ll l lll lll l lll ll llll ll ll llll ll l ll ll l lll lll ll l ll lll lllll lll ll ll lll ll ll ll l ll l llll llll l ll ll ll ll lll ll l ll l lll ll ll lll lll lll lll llllll l lll ll lll ll lll lll l llll lll ll ll llll l ll ll l ll l l lll lll l ll lll lll l llllll ll lll lll ll ll ll l lll lll lll l ll ll ll ll l ll lll llll ll ll l ll llll l lll ll l ll ll ll ll ll ll ll lll ll ll lll llll lll l lll lll ll l lll ll ll ll ll ll ll ll lll ll ll ll lll l lll ll llllll ll ll l l ll llll lllll lll ll ll ll l l ll ll l ll lll lll lll lll ll l lll ll lll l l ll l llll lll ll l ll lll ll l ll ll ll ll l lll l lll ll l llll l ll ll ll ll lllll llll lllllll lll lll ll ll l l ll ll lll l lll l lll lll l ll ll lllll l l l lll lll l ll 
lll l l l ll lll ll ll ll l ll lll ll lll ll ll ll l lll ll l ll ll llll lll lll lllll l ll l lll llll ll ll l lll ll l ll llll l lll lll ll lll l lll ll lll ll ll ll ll l ll lll l lll lll ll lll llll l ll lllll l l ll l ll l ll l ll l lll lll ll ll l ll l lll ll llll l llll l lll lll l lll ll lll l ll l ll ll llll l ll ll ll ll lll ll l ll l ll ll l ll lll lll ll lll ll ll lll l lll l lll llll ll ll lll ll ll ll ll llll lll ll l ll ll ll ll ll l lll l ll ll llll lll l ll ll ll ll lll lll lll l lll l ll llll ll ll l l lll ll ll l ll l ll ll ll lll llll ll ll llll lll l lll ll ll l llll ll ll l ll l ll l ll ll l lll lll ll lll l l llll lll l ll ll l l llll ll ll l lll lll ll lll lll ll ll llll l ll l lll l lll lll ll l llll l lll ll l lll l ll ll lll ll l lll ll ll lll lll lll lll lll ll l l lll l llll l ll l l lll ll l llll l ll l lll ll ll ll ll ll ll lll l ll ll ll lll lll llll llll ll ll ll ll lll lll l l lll ll llll llll l ll lll l llll lll l ll llll ll ll ll ll ll ll l lll ll ll l ll l l ll lll lll llll l ll l ll ll l l ll lll l l lll ll ll ll ll l ll ll lll ll l ll l ll ll ll llll l llll ll ll l lll lll lll ll l ll lll ll l l llll l ll lll lll lll ll ll ll ll ll llll ll ll lll ll ll lll l lll lll lll ll lll llll ll lll l ll ll l ll ll ll l l llll ll ll ll ll l ll l ll ll lll l l ll ll lll ll ll ll ll l ll ll ll l ll l ll l lllll l lll ll lll ll ll ll lll ll ll l lllll l lll lllll l ll lll l ll ll ll l ll ll l l lll l ll llll ll ll ll l l ll ll ll lll ll lll l ll llll ll ll l l llll lll llll ll ll l lll l l ll l llll lll ll lll l l lll ll ll ll ll lll l ll ll lll ll ll lll ll l lll l l l lll ll l l lll l lll l ll lll ll lll ll llll ll llll l l ll ll ll l l lll l ll ll llll lll ll lll l l ll llll ll lll ll lll llll ll lll ll ll l ll ll ll ll ll l llll l ll ll ll ll ll ll ll l l l ll ll l lll ll lll lll ll llll l l ll l lll ll l lllll l ll ll l ll l ll lll l lll ll ll ll llll ll lll lllll llll l lll lll ll l ll l l ll ll l ll lll lll ll l l lll ll ll ll llll ll 
ll ll lll l llll ll ll ll llll ll l l lllll ll ll ll ll l ll ll l ll l l ll ll l llll ll ll lll llll ll ll l llll lll lll lll lllll l ll ll l ll l llll l ll ll l ll ll ll ll l ll lll l llll l lll lll l lllll l lll l lll l llll lll lll ll ll ll lll l lll ll lll l ll l lll l ll ll ll ll ll ll ll ll ll ll lll lll l lll l ll lll l lll lll ll lll lll l lllll l ll llll l ll lll l llll ll ll lllll l ll l ll ll ll ll lll l lll lll l ll l ll l ll ll ll l l ll l ll lll ll ll l l ll ll ll l ll llll l lll l lll l lll l ll ll lll l llll llll ll lll llll ll l ll lllll l ll l ll ll ll ll ll lll lll l ll llll l ll llll ll l lll lll ll l lll ll ll lll l l l ll lll ll lll l ll l ll ll llll ll lll lll lll l lll l lll l ll ll ll l lll ll l llll l lllllll l l l lll l lll ll ll l lll lll ll ll l ll lll lll ll lll lllll ll l l ll l ll lll ll lll lll lll ll ll ll l llll l lll l lll l llll l ll l ll l llll lll ll lll lll l lll lll lllll l lll ll lll ll l lllll lll ll l ll ll l l lll ll lll l l lll ll ll l lll ll l ll l llll lll lll l ll llll l ll ll l ll l lll l ll l llll l l ll l ll ll ll ll l lll ll lll ll l l lll lll ll ll ll llll lll ll ll lll ll l ll ll l ll ll ll ll ll lll l ll ll l lll l lll l ll l lll ll ll l ll ll lll l ll llll ll ll l llll ll l l lll l ll ll l ll llll lll lll lll lll l ll ll ll lll l ll ll ll l l ll ll lll ll llll ll lll ll lll lll llllll ll lll ll lll l ll l ll ll l ll ll ll lll lll lll l llll llll lll ll lll lll l lll lll lll ll ll l llll ll lll llll l llll l ll ll ll l l l lll l l l lll lllll l l lll ll ll ll ll lll l l ll ll lll lll l llll l l ll l ll llll llll l ll ll lll lll l lll l ll ll ll ll ll ll l lll l ll llll llll l ll l l ll ll lll lll ll ll ll l lll ll ll l l l lllll lll lll l lll lll ll ll lll l lll l l ll llll ll lll ll l lll l l ll ll l lll lll ll l ll l ll ll lll l ll lll l l ll lll l ll l lll l l llllll l llll ll ll ll ll ll ll ll l ll ll l lll lll l ll ll l ll ll ll ll l lll l lll lll ll l ll ll ll ll l l ll ll lllll ll lll ll ll ll llll ll 
lll l ll lllll llll l l llll ll llll lll ll ll lllll llll lll l ll l lllll l l ll ll lll l lll l llll lll lll lll l ll lll ll l ll l llll ll l ll ll l ll ll ll l llll ll ll llll lll l ll ll lll ll l ll ll l lll lll ll ll ll ll l llll lll ll ll l ll ll ll lll ll lll llll l llll llll ll llll l lll llll lll lll l l lll lll l ll lll l ll ll ll l llll ll ll ll ll l lll lll ll lll lll lll lll ll llll ll lll l ll lll ll lll ll lll l lll ll ll ll ll l ll lll lll l ll ll lll l l l lll llll ll ll l ll lll ll ll ll ll ll lll llll l ll l l ll ll ll ll ll llll ll l lll lll l ll ll lll lll ll lll lll ll lll l ll l ll ll ll l lll lll ll ll ll l l lll lll l ll ll ll ll lll ll l ll ll lll ll llll lll l lll ll ll lll lll lll l ll llll l ll ll l ll ll l ll ll l lll l l ll l lll ll llll l lll llllll ll lll l ll l lll ll ll lllll ll ll l lllll ll ll l l ll ll lll l lll lll ll llll ll lll llll l lll lll l lll ll ll ll lll lll ll lll lll l l ll lll ll lll l ll llll l lll lll l l lll ll ll llll l l lll l ll ll ll ll l ll l lllll l ll llll l ll l ll lll ll lll ll l ll lll l ll ll llll ll ll ll llll llll l lll lll lll l ll ll ll llll ll lll ll llll l ll l llll ll lll lll llll ll lll l lll ll ll ll lll l lll l ll ll lll lll l l l lll l ll lll ll ll lll l ll lll l ll ll lll ll ll ll l lll l ll lll l ll ll l ll lll ll l ll ll ll lllll lll l llll l l lll llll l l lll lll lll ll ll lllll ll ll ll lll llll l lll l ll l lll ll l ll ll l lll ll l ll l ll ll l lll ll ll l lll l l ll lll l l ll lll lll lll l l ll ll ll ll ll l ll l l lll ll lll ll l lll l llll l lll llll ll ll l ll llll l l ll ll l lll l l ll lll lll ll lll l ll l l lll l ll ll lll lll ll ll l ll l lll lll lll l lll l lll ll l lll ll lll l l lll lll l ll llll ll ll l lll l ll l ll l ll lll l ll l lll ll l ll ll lll l ll lll ll lllll lll ll l ll ll ll l ll llll lll ll ll l ll ll ll ll l l l lll l llll ll ll ll l ll l ll ll lll lll l ll l ll lll llll ll lll ll l ll ll llll lll l ll l l ll ll lll lll ll ll lll l l ll llll l llll lll l 
ll ll l ll ll lll lll ll ll lll l ll lll lll ll l ll lll l ll ll lll l ll ll lll ll l ll ll ll l l llll ll l lll ll ll ll ll ll ll l l ll ll l ll llll ll lll l lll l l lll l ll ll ll ll l ll ll lll l lll ll l llll l llll l lll lll llll l ll llll ll lll lll l l ll lll l lll ll ll llll l ll lll lll ll ll lll l ll ll l ll l lll lllll ll ll lll ll ll ll l ll lll ll l l ll lll ll l llll l lll ll l lll lll ll l ll l lllll llll l ll l llll lll l lll l ll ll llll ll lll l ll l ll l ll llll l ll lll ll l ll l llll l ll ll l ll ll lll lll ll lll ll ll ll ll ll ll l ll lll ll l ll ll lll l llll ll l ll llll l ll lll ll ll ll l lll ll l l lll ll ll ll ll lll lll lllll lllll l lll ll l lllll llll ll llll l ll lll lll ll l lll ll lll ll l l lll lll ll ll ll lll lllll l l ll l ll lll ll l lll lll l lll ll lll ll ll l l llll ll l ll ll l l ll ll ll ll ll ll llll l l ll ll l l ll llll ll l lll ll l ll ll l lll lll l lll lll lll l ll ll ll lll lll ll l ll l ll ll ll ll llll l l ll ll ll ll lllll l lll ll lll ll ll ll lll lll ll l ll l ll ll lll lll l lllll lll ll ll l lll l ll ll l llll l ll lll ll ll llll l ll ll ll l llll l ll llll ll l ll lll ll llll lll lll lll lll ll lll llllll ll l lll l ll lll lll ll l l l lllll ll lll l lll ll l l lll ll l ll l ll l lll lll l lll ll lll l lll ll lll ll ll l lllll l ll ll ll lll lll lll l llll ll ll ll ll ll lll ll l llll ll lll lll ll ll ll l lll llll ll llll l l ll lll ll ll l l ll ll l ll llll lll lll ll llll ll lllll ll ll ll l ll llll ll l lll l ll ll ll llll lll lll l lll ll l lll l l lll ll l ll lll l lllll llll ll ll llll lll l llll ll ll ll lll l ll l ll ll ll ll ll l llll l lll l ll lllll ll ll l ll llll ll ll ll l ll l ll lll ll lll ll llll ll lll l ll lll ll ll llll ll ll l lll lll ll ll ll ll l l lll lll llll ll l ll ll l ll ll llllll ll ll lll l ll ll ll llll lllllll ll l lll lll l ll llllll ll lll lll ll lll l lll ll l ll lll l lll ll ll l lll l llll l lll lll l ll lll lll ll l lll ll ll ll lll l l ll ll ll l llll ll ll l ll l 
l llll lll llll ll ll l lll lll l ll lll lll ll ll l lll ll ll ll ll ll lll l lll lll l lll l ll lll ll ll l lll l ll l llll l lll lll llll lll ll llll l l lll lll l lll ll ll lll ll l llll ll lll ll l ll l l lll ll ll llll lllll l ll l ll ll ll ll l lll l ll ll l l ll ll l lll l llll ll llll ll lll l ll l l lll ll ll ll lll lll ll lll l ll lll llll l ll l ll l lll l l ll lll llll l llll llll ll l ll ll ll llll llll lll l lll ll l lllll l lll l ll lll ll l lll lll ll l lll ll ll ll ll ll l l lll ll llllll ll llll l ll ll ll llllll l llll lll lll ll l l ll ll ll l lll l llll ll ll lllll l llll llll lll ll lll l ll ll ll lll l ll ll lll ll ll lll lllll l l ll lll lll ll ll ll lll ll l ll lll lll lll l lll ll l lll ll ll ll ll l ll lll lllll llll ll l lll ll l lll ll l lll l lll llll lll ll ll ll ll lll ll l ll l l ll l lll ll llll lll llll ll ll lll lll lll ll ll llll lllll llll llll llll llll ll llll lll ll ll l ll ll lll llll lll l llll ll l llllll l lll ll lll lll ll ll lll ll lll lllll llll lll ll lll l lll ll ll lll llll lll lll ll llllll llll ll lll llll lllllllllllllll
A B
Figure 3: Transforming the unit ball B^D(1) to the sphere S^D.

Byrne and Girolami focus on special probability distributions which give rise to particularly nice Riemannian geometries. In particular, the examples described in Section 4 admit closed-form solutions to the geodesic equation, which can be used to reduce the computational cost of geometrically motivated Monte Carlo methods.

While the proposed splitting algorithm is quite interesting, we initially doubted its impact, since Riemannian metrics with closed-form geodesics are extremely rare. However, we are now convinced that this approach will likely see application beyond what is outlined herein. For example, we believe it can be used to improve the computational efficiency of sampling algorithms when the parameter space is constrained. The standard HMC algorithm must check each proposal to ensure it lies within the boundaries imposed by the constraints. Alternatively, as discussed by Neal (2011), one could modify standard HMC so that the sampler bounces back after hitting a boundary. In Appendix A, Byrne and Girolami discuss this approach for geodesic updates on the simplex.

In many cases, a constrained parameter space can be bijectively mapped to the unit ball, B^D(1) := { θ ∈ R^D : ‖θ‖_2 = (Σ_{i=1}^D θ_i^2)^{1/2} ≤ 1 }.
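To make this construction concrete, here is a minimal sketch (illustrative names, not the authors' code) of the correspondence between the unit ball B^D(1) and a hemisphere of S^D, obtained by appending an auxiliary coordinate ±(1 − ‖θ‖^2)^{1/2}:

```python
import numpy as np

def ball_to_sphere(theta, sign=1.0):
    # Append theta_{D+1} = ±sqrt(1 - ||theta||^2); max() guards round-off.
    aux = sign * np.sqrt(max(1.0 - float(theta @ theta), 0.0))
    return np.append(theta, aux)

def sphere_to_ball(theta_tilde):
    # Project back to the original space: drop the auxiliary coordinate.
    return theta_tilde[:-1]

theta = np.array([0.3, 0.4])
theta_tilde = ball_to_sphere(theta)   # lies exactly on the sphere
back = sphere_to_ball(theta_tilde)    # recovers theta
```

A sampler moving on the sphere can thus be projected back to the ball at any time, and the sign of the auxiliary coordinate records which hemisphere the chain is on.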
Augmenting the parameter space with an extra auxiliary variable θ_{D+1} = ±(1 − ‖θ‖^2)^{1/2}, we can form an extended parameter space θ̃ = (θ, θ_{D+1}) so that the domain of the target distribution changes from the unit ball B^D(1) to the D-sphere S^D = { θ̃ ∈ R^{D+1} : ‖θ̃‖ = 1 }:

    T_{B→S} : B^D(1) → S^D,    θ ↦ θ̃ = (θ, ±(1 − ‖θ‖^2)^{1/2})    (1)

Sampling from the distribution of θ̃ on S^D can be done efficiently using the geodesic Monte Carlo approach, which lets the sampler move freely on S^D while its projection onto the original space always remains within the boundary. In this way, passing across the equator from one hemisphere to the other is equivalent to reflecting off the boundary, as shown in Figure 3.

Our last comment concerns the embedding procedure discussed in Section 3.2. We wonder whether the embedding, and the resulting extra projection step, could be avoided by writing the dynamics in terms of (q, v) in the first place and splitting them as follows:

    { q̇ = 0,  v̇ = G^{-1} ∇ log π_H(q) }        { q̇ = v,  v̇ = −v^T Γ v }    (2)

where Γ is the Christoffel symbol of the second kind. The second dynamics in (2) is the general geodesic equation:

    q̈ + q̇^T Γ q̇ = 0    (3)

The first dynamics in (2) is solved in terms of (q, v) in a more natural way:

    q(t) = q(0)  and  v(t) = v(0) + t G(q)^{-1} ∇_q log π_H(q) |_{q = q(0)}    (4)

In this way we avoid the additional projection step, and v(t) ∈ T_{q(t)}M as long as v(0) ∈ T_{q(0)}M. This also serves to isolate what seems to be the key point in this work: not that the dynamics take place on an embedded manifold, but that they take place on a manifold whose geodesics are known explicitly. With this viewpoint, the applicability of the ideas of this paper should expand even further.

Comment
Daniel Simpson

The basic idea of simulation-based inference is that we can approximately calculate anything we like about a probability distribution if we can draw independent samples from it. This means that we can use sampling to explore the posterior distribution, and it turns out that the quantities we compute will usually have an error of O(N^{-1/2}) if they are calculated from N samples. Unfortunately, in almost any realistic situation we cannot simulate directly from the posterior; however, the remarkable (and their ubiquity really shouldn't detract from just how remarkable MCMC methods are) Markov chain Monte Carlo idea says that it is enough to take a chain of dependent simulations heading towards the posterior distribution and use these simulations to calculate any quantities of interest. The error of the estimators still decays like O(N^{-1/2}), and they pretty much always work eventually. (There is, of course, an entire world of details being suppressed within the word 'eventually'.)

The problem with vanilla (Metropolis-Hastings) MCMC methods is that they are slow. It is fairly easy to see why: whereas perfect Monte Carlo methods 'know' enough about the posterior to produce perfect samples, Metropolis-Hastings algorithms only require the ability to calculate ratios of the posterior density. For simple models this may not be a problem, but as the posterior distribution becomes more complicated, it is straightforward to imagine that the efficiency of schemes based on simple proposals will plummet. Byrne and Girolami consider the even more complicated situation where the natural parameters of the model have a non-linear structure. These types of models arise frequently in ecology. A simple example occurs when modelling

Department of Mathematical Sciences, Norwegian University of Science and Technology, N-7491 Trondheim, Norway. Email:
[email protected]

R^d, the only requirement is that each point in the parameter space is associated in a smooth way with a symmetric positive definite matrix. In this case, it makes sense for these matrices to be built from local approximations to the posterior distribution, and the whole scheme can be easily described without ever appealing to the slightly intimidating notion of a manifold.

The case considered by Byrne and Girolami is different. Here the parameter space isn't flat, and the notion of a manifold becomes essential to defining good inference schemes. The methods considered by Byrne and Girolami are different from the geometrically simpler models considered by Girolami and Calderhead (2011). Rather than introducing a geometric structure in order to better explore a distribution on R^n, Byrne and Girolami use the natural geometry of the parameter space to construct a proposal. It is unsurprising that this strategy results in efficient MCMC schemes: it is almost universally true that numerical methods that are consistent with the underlying structure of the problem are more efficient than those that aren't!

That is not to say that the extra efficiency from using the problem's natural manifold structure comes for free. Hamiltonian Monte Carlo methods are based on the approximate integration of Hamilton's equations, which are symplectic ordinary differential equations in position and momentum space. Integrating symplectic ODEs is an active field of research, and actually implementing these integrators can be quite challenging. In particular, the HMC method proposed by Girolami and Calderhead (2011) requires, at each step, the solution of a non-linear system of equations, which can cause the manifold HMC proposal to fail catastrophically if it is programmed incorrectly. Fortunately, Byrne and Girolami show that when the parameter space is an embedded manifold, it is possible to use a much simpler integrator.
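The simpler integrator alternates gradient "kicks" with exact geodesic moves. A minimal sketch of this kind of update on the unit sphere, where the geodesic (great-circle) flow is available in closed form; the names and step structure here are illustrative, not the authors' code:

```python
import numpy as np

def sphere_geodesic(q, v, t):
    # Exact geodesic flow on the unit sphere: rotate along the great
    # circle through q with tangent velocity v (requires q·v = 0).
    speed = np.linalg.norm(v)
    if speed == 0.0:
        return q, v
    qt = np.cos(speed * t) * q + np.sin(speed * t) * (v / speed)
    vt = -speed * np.sin(speed * t) * q + np.cos(speed * t) * v
    return qt, vt

def project_tangent(q, g):
    # Project an ambient-space vector onto the tangent space at q.
    return g - (g @ q) * q

def splitting_step(q, v, grad_log_pi, eps):
    # Half gradient kick, full geodesic move, half gradient kick.
    v = v + 0.5 * eps * project_tangent(q, grad_log_pi(q))
    q, v = sphere_geodesic(q, v, eps)
    v = v + 0.5 * eps * project_tangent(q, grad_log_pi(q))
    return q, v
```

Because the geodesic step is exact, the position never leaves the manifold, and no non-linear system needs to be solved.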
In order for their splitting technique to be applicable, it is necessary to have an explicit expression for the geodesic flow on the parameter manifold and, in the cases considered in the paper, this exists. Given an explicit form of the geodesic, one only has two choices left: the step size ε and the number of steps N in each proposal. The performance of HMC methods is known to be very sensitive to these parameters; however, recent advances in (non-manifold) HMC suggest that it is possible to adaptively select these in an efficient manner (Hoffman and Gelman, 2013).

As Byrne and Girolami have focused on building HMC methods on embedded manifolds, it is instructive to examine the barriers to similarly generalising the manifold MALA schemes. Recall that MALA-type methods on R^n are biased random walks that propose a new value θ* as

θ* − θ(k) ∼ N(μ(θ(k)), H(θ(k))^{-1}),

where the specific forms of μ(·) and H(·) are irrelevant to this discussion. The problem with generalising this type of proposal to a manifold is obvious: the subtraction operation does not make sense. One way around this problem is to take a lesson from the optimisation literature and note that we can make sense of this proposal using tangent spaces and exponential mappings (or, more generally, retractions) (Absil, Mahony, and Sepulchre, 2009). In this case, we propose

θ* = R_{θ(k)}(p(k)),

where R_{θ(k)}(·) : T_{θ(k)}M → M is a retraction map and p(k) ∼ N(μ(θ(k)), H(θ(k))^{-1}) is a random vector in the tangent space T_{θ(k)}M (Absil, Mahony, and Sepulchre, 2009). The problem with this proposal mechanism is that it is not obvious how to compute the proposal density, which is required when computing the acceptance probability.
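The retraction-based proposal is easy to sketch for the unit sphere, where one standard retraction is to step in the embedding space and renormalise. The sketch below (plain Python; the isotropic Gaussian tangent vector and the scale parameter are illustrative assumptions, not part of any scheme in the paper) produces a valid point on the manifold, but, as just noted, it leaves open the computation of the proposal density needed for the acceptance ratio:

```python
import math
import random

def project_to_tangent(x, p):
    """Remove the component of p along x, so p lies in the tangent space at x."""
    dot = sum(xi * pi for xi, pi in zip(x, p))
    return [pi - dot * xi for xi, pi in zip(x, p)]

def retract(x, p):
    """Retraction R_x(p) on the sphere: step in the embedding, then renormalise."""
    y = [xi + pi for xi, pi in zip(x, p)]
    norm = math.sqrt(sum(yi * yi for yi in y))
    return [yi / norm for yi in y]

def propose(x, scale=0.3, rng=random):
    """theta* = R_x(p) with p an isotropic Gaussian projected onto the tangent space."""
    p = project_to_tangent(x, [rng.gauss(0.0, scale) for _ in x])
    return retract(x, p)

x = [0.0, 0.0, 1.0]
x_star = propose(x)   # a point on the sphere near x
```

The proposal always lands on the manifold, yet the density of x_star under this mechanism involves the distortion introduced by the retraction, which is exactly the obstruction to a manifold MALA acceptance step.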
Hence, there is no clear way to design a MALA-type scheme that respects the non-linear structure of the parameter space.

Rejoinder
Simon Byrne and Mark Girolami

We would like to thank all respondents for their interesting comments, which clearly identify exciting areas for further investigation.

Both Kent and Dryden highlight recent developments in rejection sampling methods for obtaining independent samples from distributions on manifolds. Such methods are obviously preferable when available; however, as mentioned in Section 5.1, the danger is that rejection-based techniques can have exponentially low acceptance rates, particularly in higher-dimensional problems. Indeed, the impressive results of Kent, Ganeiber, and Mardia in avoiding this problem by obtaining constant lower bounds on the acceptance rates highlight the importance of considering the underlying geometry of the manifold.

Pereyra and Simpson point out the many links with optimisation: indeed, optimisation over manifolds has a rich history, and there is a wealth of literature with many interesting algorithms. However, as Simpson points out, many of these algorithms are based on projection operators, and thus we face what could be described as the "Curse of Detailed Balance": the difficulty of computing the reverse proposal, which is required for the evaluation of the acceptance ratio to ensure we are targeting the correct invariant density. Hamiltonian-based methods are able to exploit symplectic geometric structure (namely reversibility and volume preservation) in a manner that makes this almost trivial.

We are very excited to see that Shahbaba, Lan and Streets have had success with these methods. We agree entirely with their point that it is the explicit geodesics, and not the embedding, that make this method successful: our reason for using the embeddings is that, in all the cases we identified, the embeddings proved convenient to work with.
Our reason for using the projection is that it is typically of lower computational cost than inversion of G.

As several commenters point out, despite its long history, remarkably little is known about the theoretical properties of the HMC algorithm, especially when compared to, say, Gibbs sampling and Metropolis–Hastings algorithms based on random walks and Langevin diffusions. In particular, one open question is the optimal tuning of the step-size and integration-length parameters. Unfortunately, HMC is not readily amenable to the usual probabilistic tools, such as links to diffusions, due to the precise property that makes it so powerful: the ability to simulate long trajectories and make distant proposals. This is an open question attracting interest from numerous researchers.

The paper by Wang, Mohamed, and De Freitas (2013) proposes an empirical Bayesian optimisation approach, but this comes with significant overhead in obtaining sufficient samples on which to base the objective function, and provides little insight into theoretical behaviour. We think that future advances will perhaps require a larger set of tools, such as exploiting the rich geometric structure and elegant numerical properties of Hamiltonian methods (e.g. Hairer, Lubich, and Wanner, 2006).

Department of Statistical Science, University College London
References
Absil, P-A, Robert Mahony, and Rodolphe Sepulchre (2009). Optimization algorithms on matrix manifolds. Princeton University Press.
Afonso, M. V., J. M. Bioucas-Dias, and M. A. T. Figueiredo (2011). "An Augmented Lagrangian Approach to the Constrained Optimization Formulation of Imaging Inverse Problems". In: IEEE Trans. Image Processing.
Asimov, Daniel (1985). "The grand tour: a tool for viewing multidimensional data". In: SIAM Journal on Scientific and Statistical Computing.
In: Advances in Neural Information Processing Systems 19, p. 129.
Beskos, A. et al. (2011). "Hybrid Monte Carlo on Hilbert spaces". In: Stochastic Process. Appl.
Beskos, Alexandros et al. (2013). "Optimal Tuning of the Hybrid Monte-Carlo Algorithm". In: Bernoulli. To appear.
Betancourt, Michael (2013). "A General Metric for Riemannian Manifold Hamiltonian Monte Carlo". In: Geometric Science of Information. Ed. by Frank Nielsen and Frédéric Barbaresco. Vol. 8085. Lecture Notes in Computer Science, pp. 327–334.
Bhattacharya, Abhishek and David B. Dunson (Aug. 2012). "Strong consistency of nonparametric Bayes density estimation on compact metric spaces with applications to specific manifolds". In: Annals of the Institute of Statistical Mathematics.
In: IEEE J. Sel. Topics Appl. Earth Observations Remote Sensing.
Chatterjee, Sourav and Persi Diaconis (2013). Fluctuations of the Bose-Einstein condensate. arXiv preprint.
Christensen, Ole F., Gareth O. Roberts, and Martin Sköld (2006). "Robust Markov chain Monte Carlo methods for spatial generalized linear mixed models". In: Journal of Computational and Graphical Statistics.
In: Fixed-Point Algorithms for Inverse Problems in Science and Engineering. Ed. by Heinz H. Bauschke et al. Springer New York, pp. 185–212.
Diaconis, Persi and David Freedman (1986). "On the Consistency of Bayes Estimates". In: The Annals of Statistics.
In: Annals of Applied Probability, pp. 726–752.
Diaconis, Persi, Susan Holmes, and Mehrdad Shahshahani (2012). "Sampling From A Manifold". In: Advances in Modern Statistical Theory and Applications: A Festschrift in honor of Morris L. Eaton. Ed. by Galin Jones and Xiaotong Shen. Institute of Mathematical Statistics, pp. 100–122. arXiv preprint.
Diaconis, Persi and Bernd Sturmfels (1998). "Algebraic algorithms for sampling from conditional distributions". In: The Annals of Statistics.
In: Ultramicroscopy.
In: IEEE Trans. Signal Processing.
In: The Annals of Statistics.
Federer, Herbert (1969). Geometric measure theory. Die Grundlehren der mathematischen Wissenschaften, Band 153. New York: Springer-Verlag.
Girolami, Mark and Ben Calderhead (2011). "Riemann manifold Langevin and Hamiltonian Monte Carlo methods". In: J. R. Stat. Soc. Ser. B Stat. Methodol.
Golbabaee, M., S. Arberet, and P. Vandergheynst (Aug. 2012). Compressive Source Separation: Theory and Methods for Hyperspectral Imaging. arXiv preprint.
Golbabaee, M. and P. Vandergheynst (2012). Compressed Sensing of Simultaneous Low-Rank and Joint-Sparse Matrices. arXiv preprint.
Grayson, Matthew A. (1989). "A short note on the evolution of a surface by its mean curvature". In: Duke Mathematical Journal.
In: Biometrika.
Hairer, Ernst, Christian Lubich, and Gerhard Wanner (2006). Geometric numerical integration: Structure-preserving algorithms for ordinary differential equations. Second edition. Vol. 31. Springer Series in Computational Mathematics. Berlin: Springer-Verlag. ISBN: 3-540-30663-3; 978-3-540-30663-4.
Hoffman, Matt and Andrew Gelman (2013). "The no-U-turn sampler: Adaptively setting path lengths in Hamiltonian Monte Carlo". In: Journal of Machine Learning Research. To appear.
Jones, Galin L. and James P. Hobert (2001). "Honest exploration of intractable probability distributions via Markov chain Monte Carlo". In: Statistical Science, pp. 312–334.
Kent, John T., Asaad M. Ganeiber, and Kanti V. Mardia (2013). A new method to simulate the Bingham and related distributions in directional data analysis with applications. arXiv preprint.
Kume, A. and Andrew T. A. Wood (2005). "Saddlepoint approximations for the Bingham and Fisher-Bingham normalising constants". In: Biometrika.
Mardia, Kanti V. and Peter E. Jupp (2000). Directional statistics. Wiley Series in Probability and Statistics. Chichester: John Wiley & Sons Ltd. ISBN: 0-471-95333-4.
Marsland, Stephen et al. (Oct. 2012). "Geodesic Warps by Conformal Mappings". In: International Journal of Computer Vision, pp. 1–11.
McLachlan, Robert I. and G. R. W. Quispel (2003). "Geometric integration of conservative polynomial ODEs". In: Applied Numerical Mathematics.
In: NeuroImage 23 Suppl 1, S19–S33.
Morgan, Frank (2009). Geometric measure theory: A beginner's guide. Fourth edition. Amsterdam: Elsevier/Academic Press. ISBN: 978-0-12-374444-9.
Neal, Radford M. (2011). "MCMC using Hamiltonian dynamics". In: Handbook of Markov Chain Monte Carlo. Chapman & Hall/CRC Handbooks of Modern Statistical Methods. Boca Raton, FL: CRC Press, pp. 113–162.
Ovaskainen, Otso and Janne Soininen (2011). "Making more out of sparse data: hierarchical modeling of species communities". In: Ecology.
Pereyra, Marcelo (2013). Proximal Markov chain Monte Carlo algorithms. arXiv preprint.
Perraul-Joncas, Dominique and Marina Meilă (2013). Non-linear dimensionality reduction: Riemannian metric estimation and the problem of geometric discovery. arXiv preprint.
Rue, Håvard (2001). "Fast sampling of Gaussian Markov random fields". In: Journal of the Royal Statistical Society: Series B (Statistical Methodology).
In: Journal of the Royal Statistical Society: Series B (Statistical Methodology).
In: Proceedings of the 21st International Workshop on Statistical Modelling, pp. 448–456.
Seiler, Christof, Xavier Pennec, and Susan Holmes (2013). "Random Spatial Structure of Geometric Deformations and Bayesian Nonparametrics". In: Geometric Science of Information. Vol. 8085. LNCS. Springer, pp. 120–127.
Stan Development Team (2013). Stan: A C++ Library for Probability and Sampling. URL: http://mc-stan.org/.
Von Luxburg, Ulrike, Mikhail Belkin, and Olivier Bousquet (2008). "Consistency of Spectral Clustering". In: The Annals of Statistics.
Wang, Ziyu, Shakir Mohamed, and Nando de Freitas (2013). "Adaptive Hamiltonian and Riemann manifold Monte Carlo samplers". In: Proceedings of the 30th International Conference on Machine Learning. Ed. by Sanjoy Dasgupta and David McAllester, pp. 1462–1470.
Younes, Laurent (May 2010). Shapes and Diffeomorphisms. First edition. Vol. 171. Springer. ISBN: 3642120547.
Zhang, Yichuan et al. (2012). "Continuous Relaxations for Discrete Hamiltonian Monte Carlo". In: