Michael P. Holmes
Georgia Institute of Technology
Publications
Featured research published by Michael P. Holmes.
international conference on machine learning | 2006
Michael P. Holmes; Charles Lee Isbell
We present a solution for inferring hidden state from sensorimotor experience when the environment takes the form of a POMDP with deterministic transition and observation functions. Such environments can appear to be arbitrarily complex and non-deterministic on the surface, but are actually deterministic with respect to the unobserved underlying state. We show that there always exists a finite history-based representation that fully captures the unobserved world state, allowing for perfect prediction of action effects. This representation takes the form of a looping prediction suffix tree (PST). We derive a sound and complete algorithm for learning a looping PST from a sufficient sample of sensorimotor experience. We also give empirical illustrations of the advantages conferred by this approach, and characterize the approximations to the looping PST that are made by existing algorithms such as Variable Length Markov Models, Utile Suffix Memory and Causal State Splitting Reconstruction.
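As a rough illustration of the representation described in this abstract (not the paper's looping-PST learning algorithm), the Python sketch below shows a plain prediction suffix tree over histories of (action, observation) pairs, where each stored suffix predicts the next observation; the looping construction in the paper extends this idea so that a finite tree can capture unbounded histories. All class and method names here are hypothetical.

```python
# Minimal prediction suffix tree (PST) sketch: a history is a sequence of
# (action, observation) pairs, and each stored suffix accumulates counts of the
# observation that followed it. Illustrative only; names are hypothetical.

class PredictionSuffixTree:
    def __init__(self, max_depth=3):
        self.max_depth = max_depth
        # Maps a history suffix (tuple of (action, obs) pairs) to a dict of
        # next-observation counts.
        self.counts = {}

    def update(self, history, next_obs):
        """Record one transition: every suffix of `history` predicted `next_obs`."""
        for d in range(1, self.max_depth + 1):
            if len(history) < d:
                break
            suffix = tuple(history[-d:])
            dist = self.counts.setdefault(suffix, {})
            dist[next_obs] = dist.get(next_obs, 0) + 1

    def predict(self, history):
        """Predict the next observation using the longest matching stored suffix."""
        for d in range(min(self.max_depth, len(history)), 0, -1):
            suffix = tuple(history[-d:])
            if suffix in self.counts:
                dist = self.counts[suffix]
                return max(dist, key=dist.get)
        return None  # no matching suffix observed yet


# Usage: feed in sensorimotor experience, then query predictions.
pst = PredictionSuffixTree(max_depth=2)
experience = [("left", "wall"), ("right", "open"), ("left", "wall"), ("right", "open")]
for t in range(1, len(experience)):
    pst.update(experience[:t], experience[t][1])
print(pst.predict([("left", "wall")]))  # -> "open", the observation that followed this suffix
```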
Computational Statistics & Data Analysis | 2010
Michael P. Holmes; Alexander G. Gray; Charles Lee Isbell
We describe a fast, data-driven bandwidth selection procedure for kernel conditional density estimation (KCDE). Specifically, we give a Monte Carlo dual-tree algorithm for efficient, error-controlled approximation of a cross-validated likelihood objective. While exact evaluation of this objective has an unscalable O(n^2) computational cost, our method is practical and shows speedup factors as high as 286,000 when applied to real multivariate datasets containing up to one million points. In absolute terms, computation times are reduced from months to minutes. This enables applications at much greater scale than previously possible. The core idea in our method is to first derive a standard deterministic dual-tree approximation, whose loose deterministic bounds we then replace with tight, probabilistic Monte Carlo bounds. The resulting Monte Carlo dual-tree algorithm exhibits strong error control and high speedup across a broad range of datasets several orders of magnitude greater in size than those reported in previous work. The cost of this high acceleration is the loss of the formal error guarantee of the deterministic dual-tree framework; however, our experiments show that error is still amply controlled by our Monte Carlo algorithm, and the many-order-of-magnitude speedups are worth this sacrifice in the large-data case, where cross-validated bandwidth selection for KCDE would otherwise be impractical.
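For context, the quadratic-cost objective referenced above is the leave-one-out cross-validated likelihood of the kernel conditional density estimate. The sketch below evaluates that objective naively in O(n^2) with Gaussian kernels; this is the quantity the paper's Monte Carlo dual-tree algorithm approximates far more efficiently. The function names and the specific kernel choice are assumptions made for illustration, not the paper's implementation.

```python
# Naive O(n^2) leave-one-out cross-validated log-likelihood for kernel
# conditional density estimation (KCDE) with Gaussian kernels. Illustrative
# sketch of the objective only; details here are assumptions.
import numpy as np

def gaussian_kernel(u, h):
    """Isotropic Gaussian kernel with bandwidth h, applied row-wise to differences u."""
    d = u.shape[-1]
    sq = np.sum(u * u, axis=-1)
    return np.exp(-0.5 * sq / h**2) / ((2 * np.pi) ** (d / 2) * h**d)

def loo_cv_log_likelihood(X, Y, h_x, h_y):
    """Leave-one-out CV log-likelihood of the KCDE estimate of p(y | x)."""
    n = X.shape[0]
    total = 0.0
    for i in range(n):
        kx = gaussian_kernel(X - X[i], h_x)   # kernel weights in the x-space
        ky = gaussian_kernel(Y - Y[i], h_y)   # kernel weights in the y-space
        kx[i] = 0.0                           # leave point i out
        ky[i] = 0.0
        num = np.sum(kx * ky)                 # joint kernel sum
        den = np.sum(kx)                      # marginal kernel sum over x
        total += np.log(num / den) if num > 0 and den > 0 else -np.inf
    return total

# Bandwidths would then be chosen by maximizing this objective, e.g. over a grid:
# best = max(((hx, hy) for hx in grid for hy in grid),
#            key=lambda b: loo_cv_log_likelihood(X, Y, *b))
```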
knowledge discovery and data mining | 2018
Tom Diethe; Michael P. Holmes; Meelis Kull; Miquel Perello Nieto; Kacper Sokol; Hao Song; Niall Twomey; Peter A. Flach
The SPHERE project is devoted to advancing eHealth in a smart-home context, and supports full-scale sensing and data analysis to enable a generic healthcare service. We describe, from a data-science perspective, our experience of taking the system out of the laboratory into more than thirty homes in Bristol, UK. We describe the infrastructure and processes that had to be developed along the way, describe how we train and deploy Machine Learning systems in this context, and give a realistic appraisal of the state of the deployed systems.
international joint conference on artificial intelligence | 2007
Manu Sharma; Michael P. Holmes; Juan Carlos Santamaria; Arya Irani; Charles Lee Isbell; Ashwin Ram
neural information processing systems | 2004
Michael P. Holmes; Charles Lee Isbell
uncertainty in artificial intelligence | 2007
Michael P. Holmes; Alexander G. Gray; Charles Lee Isbell
neural information processing systems | 2008
Michael P. Holmes; Charles Lee Isbell Jr.; Alexander G. Gray
neural information processing systems | 2010
Harold Pashler; Matthew H. Wilder; Robert V. Lindsey; Matt Jones; Michael C. Mozer; Michael P. Holmes
neural information processing systems | 2007
Michael P. Holmes; Alexander G. Gray; Charles Lee Isbell
neural information processing systems | 2007
Charles Lee Isbell; Michael P. Holmes; Alexander G. Gray