James Imber
University of Surrey
Publications
Featured research published by James Imber.
European Conference on Computer Vision | 2014
James Imber; Jean-Yves Guillemaut; Adrian Hilton
This paper presents an approach to estimating the intrinsic texture properties (albedo, shading, normal) of scenes from multiple-view acquisition under unknown illumination conditions. We introduce the concept of intrinsic textures: pixel-resolution surface textures representing the intrinsic appearance parameters of a scene. Unlike previous video relighting methods, the approach does not assume regions of uniform albedo, which makes it applicable to richly textured scenes. We show that intrinsic image methods can refine an initial, low-frequency shading estimate, based on a global lighting reconstruction from an original texture and coarse scene geometry, in order to resolve the inherent global ambiguity in shading. The method is applied to relighting free-viewpoint renderings from multiple-view video capture, demonstrating relighting with reproduction of fine surface detail. Quantitative evaluation on synthetic models with textured appearance shows accurate estimation of intrinsic surface reflectance properties.
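The core factorisation behind intrinsic textures can be sketched in a few lines: an observed texture is modelled as the per-pixel product of albedo and shading, so given a shading estimate the albedo follows by division. This is a minimal illustrative sketch, not the paper's method; the function name and the coarse shading input are assumptions.

```python
import numpy as np

def decompose_intrinsic(texture, shading_estimate, eps=1e-6):
    """Recover per-pixel albedo A from an observed texture I,
    assuming the intrinsic image model I = A * S, where S is a
    (here, given) shading estimate. eps guards against division
    by zero in unlit pixels."""
    return texture / (shading_estimate + eps)

# Toy 2x2 greyscale texture and a coarse, low-frequency shading field.
texture = np.array([[0.2, 0.4],
                    [0.6, 0.8]])
shading = np.array([[0.5, 0.5],
                    [1.0, 1.0]])

albedo = decompose_intrinsic(texture, shading)
# The factorisation is consistent: albedo * shading reconstructs the texture.
```

In the paper the shading estimate is itself refined (resolving the global albedo/shading ambiguity); the sketch only shows the factorisation that refinement feeds into.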
Computer Vision / Computer Graphics Collaboration Techniques | 2013
James Imber; Marco Volino; Jean-Yves Guillemaut; Simon Fenney; Adrian Hilton
Free-viewpoint video renderers (FVVR) allow a user to view captured video footage from any position and direction. Despite the obvious appeal of such systems, they have yet to make a major impact on digital entertainment. Current FVVR implementations have been confined to desktop computers, yet media consumption increasingly takes place on mobile devices such as smartphones and tablets; adapting FVVR to mobile platforms will open up this new form of media to a wider audience. An efficient, high-quality FVVR that runs in real time with user interaction on a mobile device is presented, with performance comparable to recent desktop implementations. The FVVR supports relighting and integration of relightable free-viewpoint video (FVV) content into computer-generated scenes. A novel approach to relighting FVVR content is presented that does not require prior knowledge of the scene illumination or accurate surface geometry. Surface appearance is separated into a detail component and a set of materials with properties determining surface colour and specular behaviour, allowing plausible relighting of dynamic FVV for rendering on mobile devices.
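The material model described above (per-material diffuse colour plus specular behaviour) can be illustrated with a generic Blinn–Phong shading sketch. This is a stand-in for exposition only; the paper's actual material representation and shader are not reproduced here, and all parameter names are assumptions.

```python
import numpy as np

def relight(normal, light_dir, view_dir, diffuse, spec_color, shininess):
    """Shade a surface point under a single directional light using
    Blinn-Phong: diffuse term scaled by N.L plus a specular lobe
    around the half-vector. Inputs are 3-vectors; diffuse and
    spec_color are scalar intensities for simplicity."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = (l + v) / np.linalg.norm(l + v)        # half-vector
    n_dot_l = max(float(np.dot(n, l)), 0.0)    # clamp back-facing light
    spec = max(float(np.dot(n, h)), 0.0) ** shininess
    return diffuse * n_dot_l + spec_color * spec

# Head-on light and viewer: both diffuse and specular terms peak.
shade = relight(np.array([0.0, 0.0, 1.0]),
                np.array([0.0, 0.0, 1.0]),
                np.array([0.0, 0.0, 1.0]),
                diffuse=1.0, spec_color=1.0, shininess=16.0)
```

Changing `light_dir` at render time is what makes the captured content relightable; the detail component mentioned in the abstract would modulate this per-material result at pixel resolution.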
Archive | 2016
James Imber; Adrian Hilton
Archive | 2014
James Imber; Adrian Hilton; Jean-Yves Guillemaut
Archive | 2017
James Imber; Adrian Hilton; Jean-Yves Guillemaut