Maha El Choubassi
Intel
Publications
Featured research published by Maha El Choubassi.
Conference on Multimedia Modeling | 2010
Maha El Choubassi; Oscar Nestares; Yi Wu; Igor Kozintsev; Horst W. Haussecker
We present an augmented reality tourist guide for mobile devices. Many of the latest mobile devices contain cameras as well as location, orientation, and motion sensors. We demonstrate how these devices can be used to bring tourism information to users in a much more immersive manner than traditional text or maps. Our system uses a combination of camera, location, and orientation sensors to augment the live camera view on a device with the available information about the objects in the view. The augmenting information is obtained by matching a camera image to images in a server-side database that have geotags in the vicinity of the user's location. We use a subset of geotagged English Wikipedia pages as the main source of images and augmenting text information. At the time of publication, our database contained 50K pages with more than 150K images linked to them. A combination of motion estimation algorithms and orientation sensors is used to track objects of interest in the live camera view and place augmented information on top of them.
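The abstract's retrieval step (matching a camera frame against database images geotagged near the user) can be illustrated with a short Python sketch. This is a minimal illustration under stated assumptions, not the paper's implementation: the database record layout (lat, lon, descriptors, page title), the 500 m search radius, the use of ORB features, and the function names are all hypothetical.

    import math
    import cv2  # OpenCV for feature extraction/matching (assumed available)

    EARTH_RADIUS_M = 6_371_000

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two lat/lon points."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = p2 - p1
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def match_query_to_geotagged_db(query_img, user_lat, user_lon, db,
                                    radius_m=500):
        """Match a camera frame against database images geotagged near
        the user. `db` is a hypothetical list of records:
        (lat, lon, orb_descriptors, page_title). Returns the title of
        the best-matching record, or None."""
        orb = cv2.ORB_create(nfeatures=1000)
        _, q_desc = orb.detectAndCompute(query_img, None)
        if q_desc is None:
            return None
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        best_title, best_score = None, 0
        for lat, lon, desc, title in db:
            # Geo-filter: only consider images tagged within radius_m
            # of the sensed user location, as the abstract describes.
            if haversine_m(user_lat, user_lon, lat, lon) > radius_m:
                continue
            matches = matcher.match(q_desc, desc)
            # Count sufficiently close descriptor matches as the score.
            score = sum(1 for m in matches if m.distance < 40)
            if score > best_score:
                best_title, best_score = title, score
        return best_title

The geo-filter is the key design point: it shrinks the candidate set from the whole database to images near the user before any visual matching is attempted.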
International Conference on Multimedia and Expo | 2011
Gabriel Takacs; Maha El Choubassi; Yi Wu; Igor Kozintsev
In this paper, we present a large-scale mobile augmented reality system that recognizes the buildings in the mobile device's live video and registers this live view with 3-dimensional models of the buildings. Once the camera pose is estimated and tracked, the system adds relevant information about the buildings to the video in the correct perspective. We demonstrate the system on a large database of geo-tagged panoramic images of an urban environment with associated 3-dimensional planar models. The system uses the capabilities of emerging mobile platforms, such as location and orientation sensors and computational power, to detect, track, and augment buildings in urban scenes.
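A minimal sketch of the registration step this abstract describes: estimating the camera pose from 2D-3D correspondences on a recognized building facade, then projecting an annotation into the frame in correct perspective. The PnP-based formulation, the input layout, and all names here are assumptions for illustration, not the paper's actual pipeline.

    import numpy as np
    import cv2

    def augment_building(frame, pts_3d, pts_2d, camera_matrix,
                         label_3d, label_text):
        """Estimate camera pose from matched points on a building's
        planar 3D model, then draw a label at a 3D anchor point in the
        correct perspective. All inputs are hypothetical placeholders.

        pts_3d: Nx3 points on the building's planar model (world coords)
        pts_2d: Nx2 matching points detected in the live frame
        label_3d: 3D anchor where the annotation should appear
        """
        dist_coeffs = np.zeros(4)  # assume an undistorted camera
        ok, rvec, tvec = cv2.solvePnP(
            pts_3d.astype(np.float32), pts_2d.astype(np.float32),
            camera_matrix, dist_coeffs)
        if not ok:
            return frame
        # Project the 3D anchor into the image with the estimated pose.
        pt_2d, _ = cv2.projectPoints(
            label_3d.reshape(1, 3).astype(np.float32),
            rvec, tvec, camera_matrix, dist_coeffs)
        u, v = pt_2d.ravel()
        cv2.putText(frame, label_text, (int(u), int(v)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        return frame

Because the annotation is anchored to a 3D point rather than a 2D screen position, it stays attached to the building as the estimated pose changes from frame to frame.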
International Symposium on Mixed and Augmented Reality | 2011
Yi Wu; Maha El Choubassi; Igor Kozintsev
We describe an augmented reality prototype for exploring a 3D urban environment on mobile devices. Our system utilizes the location and orientation sensors on the mobile platform as well as computer vision techniques to register the live view of the device with the 3D urban data. In particular, the system recognizes the buildings in the live video, tracks the camera pose, and augments the video with relevant information about the buildings in the correct perspective. The 3D urban data consist of 3D point clouds and corresponding geo-tagged RGB images of the urban environment. We also discuss the processing steps to make such 3D data scalable and usable by our system.
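One common way to make geo-tagged 3D data "scalable and usable," as the abstract puts it, is a spatial tile index so a mobile client only fetches the point-cloud chunks near its sensed location. The sketch below assumes a simple lat/lon grid; the tile size, class name, and API are illustrative only and not taken from the paper.

    import math
    from collections import defaultdict

    TILE_SIZE_DEG = 0.001  # roughly 100 m tiles at mid latitudes (assumption)

    def tile_key(lat, lon):
        """Quantize a geotag into a grid-tile key for constant-time lookup."""
        return (math.floor(lat / TILE_SIZE_DEG),
                math.floor(lon / TILE_SIZE_DEG))

    class UrbanTileIndex:
        """Minimal spatial index: buckets geo-tagged 3D point-cloud
        chunks (and their RGB images) into lat/lon tiles so a client
        only downloads data near its location. A sketch, not the
        paper's actual processing pipeline."""

        def __init__(self):
            self._tiles = defaultdict(list)

        def add_chunk(self, lat, lon, chunk):
            self._tiles[tile_key(lat, lon)].append(chunk)

        def query(self, lat, lon, ring=1):
            """Return chunks in the tile containing (lat, lon) plus a
            ring of neighboring tiles, to cover the camera's view range."""
            ti, tj = tile_key(lat, lon)
            out = []
            for di in range(-ring, ring + 1):
                for dj in range(-ring, ring + 1):
                    out.extend(self._tiles[(ti + di, tj + dj)])
            return out

Partitioning by location keeps the per-query working set small regardless of how large the city-scale dataset grows, which is the essence of the scalability concern the abstract raises.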
Archive | 2010
Douglas Gray; Yi Wu; Igor Kozintsev; Horst W. Haussecker; Maha El Choubassi
Archive | 2011
Yi Wu; Maha El Choubassi; Igor Kozintsev; Richard Beckwith; Kenneth T. Anderson; Maria Bezaitis
Archive | 2011
Yi Wu; Gabriel Takacs; Maha El Choubassi; Igor Kozintsev
Archive | 2010
Maha El Choubassi; Igor Kozintsev; Yi Wu; Yoram Gat; Horst W. Haussecker
Archive | 2011
Joshua J. Ratcliff; Yi Wu; Maha El Choubassi; Yoram Gat; Wei Victoria Sun; Kalpana Seshadrinathan; Igor Kozintsev
Archive | 2014
Yan Xu; Maha El Choubassi; Joshua J. Ratcliff
Archive | 2018
Maha El Choubassi; Oscar Nestares