Stefanie Zollmann
Graz University of Technology
Publications
Featured research published by Stefanie Zollmann.
Ubiquitous Computing | 2012
Tobias Langlotz; Stefan Mooslechner; Stefanie Zollmann; Claus Degendorfer; Gerhard Reitmayr; Dieter Schmalstieg
We present a novel system allowing in situ content creation for mobile Augmented Reality in unprepared environments. The system targets smartphones and therefore supports spontaneous, in-place authoring. We describe two scenarios, which depend on the size of the working environment and consequently use different tracking techniques: a natural-feature-based approach for planar targets is used for small working spaces, whereas for larger working environments, such as outdoor scenarios, a panorama-based orientation tracking is deployed. Both are integrated into one system, allowing the user to create content with the same interactions, applying a set of simple yet powerful modeling functions. The resulting Augmented Reality content can be shared with other users via a dedicated content server or kept in a private inventory for later use.
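The abstract's choice between two tracking techniques based on workspace size could be sketched as a simple dispatcher; the 5 m threshold and all names here are illustrative assumptions, not values from the paper:

```python
from enum import Enum

class Tracker(Enum):
    PLANAR_NATURAL_FEATURE = "natural-feature tracking of a planar target"
    PANORAMIC_ORIENTATION = "panorama-based orientation tracking"

def select_tracker(workspace_extent_m: float, threshold_m: float = 5.0) -> Tracker:
    """Pick a tracking technique from the approximate workspace size.

    Small working spaces use natural-feature tracking of a planar target;
    larger (e.g. outdoor) environments fall back to panorama-based
    orientation tracking. The threshold is a hypothetical placeholder.
    """
    if workspace_extent_m <= threshold_m:
        return Tracker.PLANAR_NATURAL_FEATURE
    return Tracker.PANORAMIC_ORIENTATION
```

Because both trackers feed the same authoring interactions, the rest of the system can stay agnostic to which mode was selected.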
International Symposium on Mixed and Augmented Reality | 2010
Stefanie Zollmann; Denis Kalkofen; Erick Mendez; Gerhard Reitmayr
In augmented reality displays, X-Ray visualization techniques make hidden objects visible by combining the physical view with an artificial rendering of the hidden information. An important step in X-Ray visualization is deciding which parts of the physical scene should be kept and which should be replaced by overlays. The combination should provide users with the essential perceptual cues to understand the depth relationship between hidden information and the physical scene. In this paper we present an approach that addresses this decision in unknown environments by analyzing camera images of the physical scene and using the extracted information for occlusion management. Pixels are grouped into perceptually coherent image regions, and a set of parameters is determined for each region. The parameters adjust the X-Ray visualization to either preserve existing structures or generate synthetic structures. Finally, users can customize the overall opacity of foreground regions to adapt the visualization.
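The per-region occlusion management described above could be sketched roughly as follows for grayscale images; the edge-density heuristic, the externally supplied segmentation labels, and all parameter choices are stand-in assumptions, not the paper's actual parameters:

```python
import numpy as np

def xray_composite(physical: np.ndarray, hidden: np.ndarray,
                   labels: np.ndarray, user_opacity: float = 0.7) -> np.ndarray:
    """Blend a rendering of hidden structure into the physical view per region.

    `labels` assigns each pixel to a perceptually coherent image region
    (any segmentation will do). As a stand-in saliency cue, regions with
    high edge density keep more of the physical view, preserving the
    structures that give depth cues; `user_opacity` is the user-adjustable
    overall opacity of the hidden layer.
    """
    phys = physical.astype(float)
    gy, gx = np.gradient(phys)
    edges = np.hypot(gx, gy)
    out = np.empty_like(phys)
    for region in np.unique(labels):
        mask = labels == region
        # Normalized edge density of this region: edge-dense regions are
        # assumed to carry strong structural cues worth preserving.
        density = edges[mask].mean() / (edges.max() + 1e-9)
        alpha = user_opacity * (1.0 - density)  # opacity of the hidden layer
        out[mask] = (1.0 - alpha) * phys[mask] + alpha * hidden[mask]
    return out
```

A real system would derive several parameters per region (not just one blend weight) and could also synthesize structure where none exists, as the abstract notes.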
Ubiquitous Computing | 2013
Gerhard Schall; Stefanie Zollmann; Gerhard Reitmayr
Many civil engineering tasks require access to geospatial data in the field and referencing the stored information to the real-world situation. Augmented reality (AR), which interactively overlays 3D graphical content directly over a view of the world, can be a useful tool not only to visualize but also to create, edit and update geospatial data representing real-world artifacts. We present research results on a next-generation field information system for companies relying on geospatial data, providing mobile workforces with capabilities for on-site inspection and planning, data capture and as-built surveying. To achieve this aim, we used mobile AR technology for on-site surveying of geometric and semantic attributes of geospatial 3D models on the user's handheld device. The interactive 3D visualizations, automatically generated from production databases, provide immediate visual feedback for many tasks and lead to a round-trip workflow in which planned data are used as a basis for as-built surveying through manipulation of the planned data. Surveying of geospatial objects is a typical task performed by utility companies on a daily basis. We demonstrate a mobile AR system capable of these operations and present first field trials with expert end users from utility companies. Our initial results show that the workflows of planning and surveying geospatial objects benefit from our AR approach.
Proceedings of the IEEE | 2014
Stefanie Zollmann; Christof Hoppe; Stefan Kluckner; Christian Poglitsch; Horst Bischof; Gerhard Reitmayr
Augmented reality (AR) allows for on-site presentation of information that is registered to the physical environment. Applications from civil engineering, which require users to process complex information, are among those that can benefit particularly from such a presentation. In this paper, we describe how to use AR to support monitoring and documentation of construction site progress. For these tasks, the responsible staff usually requires fast and comprehensible access to progress information to enable comparison of the as-built status with the as-planned data. Instead of tediously searching for and mapping related information to the actual construction site environment, our AR system allows access to information right where it is needed. This is achieved by superimposing progress as well as as-planned information onto the user's view of the physical environment. For this purpose, we present an approach that uses aerial 3-D reconstruction to automatically capture progress information and a mobile AR client for on-site visualization. Within this paper, we describe in greater detail how to capture 3-D data, how to register the AR system within the physical outdoor environment, how to visualize progress information in a comprehensible way in an AR overlay, and how to interact with this kind of information. By implementing such an AR system, we are able to provide an overview of the possibilities and future applications of AR in the construction industry.
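The comparison of as-built against as-planned state could be sketched as a per-voxel classification over two occupancy grids; the voxel representation and the four labels are illustrative assumptions, not the paper's actual data model:

```python
import numpy as np

# Hypothetical per-voxel progress labels for an AR color overlay.
EMPTY, MISSING, DONE, UNPLANNED = 0, 1, 2, 3

def progress_map(planned: np.ndarray, built: np.ndarray) -> np.ndarray:
    """Classify each voxel of the site from two boolean occupancy grids.

    `planned` comes from the as-planned model, `built` from an (assumed)
    voxelized aerial 3-D reconstruction of the current as-built state.
    """
    out = np.full(planned.shape, EMPTY, dtype=np.int8)
    out[planned & ~built] = MISSING    # planned but not yet constructed
    out[planned & built] = DONE        # constructed as planned
    out[~planned & built] = UNPLANNED  # constructed, absent from the plan
    return out
```

An AR client could then render each label class in its own color over the user's view, which is one plausible way to make progress deviations visible on-site.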
International Symposium on Mixed and Augmented Reality | 2010
Lukas Gruber; Steffen Gauglitz; Jonathan Ventura; Stefanie Zollmann; Manuel J. Huber; Michael Schlegel; Gudrun Klinker; Dieter Schmalstieg; Tobias Höllerer
We describe the design and implementation of a physical and virtual model of an imaginary urban scene—the “City of Sights”—that can serve as a backdrop or “stage” for a variety of Augmented Reality (AR) research. We argue that the AR research community would benefit from such a standard model dataset, which can be used for evaluation of such AR topics as tracking systems, modeling, spatial AR, rendering tests, collaborative AR and user interface design. By openly sharing the digital blueprints and assembly instructions for our models, we allow the proposed set to be physically replicated by anyone and permit customization and experimental changes to the stage design, enabling comprehensive exploration of algorithms and methods. Furthermore, we provide an accompanying rich dataset consisting of video sequences under varying conditions with ground-truth camera pose. We employed three different ground-truth acquisition methods to support a broad range of use cases. The goal of our design is to enable and improve the replicability and evaluation of future augmented reality research.
International Symposium on Mixed and Augmented Reality | 2013
Denis Kalkofen; Eduardo E. Veas; Stefanie Zollmann; Markus Steinberger; Dieter Schmalstieg
In Augmented Reality (AR), ghosted views allow a viewer to explore hidden structure within the real-world environment. A body of previous work has explored which features are suitable to support the structural interplay between occluding and occluded elements. However, the dynamics of AR environments pose serious challenges to the presentation of ghosted views. While a model of the real world may help determine distinctive structural features, changes in appearance or illumination are detrimental to the composition of occluding and occluded structure. In this paper, we present an approach that considers the information value of the scene before and after generating the ghosted view. Based on this, a contrast adjustment of the preserved occluding features is calculated, which adaptively varies their visual saliency within the ghosted-view visualization. This allows us not only to preserve important features, but also to support their prominence after revealing the occluded structure, thus achieving a positive effect on the perception of ghosted views.
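The idea of restoring the prominence of preserved features after the occluded layer is revealed could be sketched as a contrast boost against the new background; the mean-absolute-difference contrast measure and the target value are illustrative assumptions, not the paper's saliency model:

```python
import numpy as np

def adjust_preserved_contrast(features: np.ndarray, background: np.ndarray,
                              target_contrast: float = 0.3) -> np.ndarray:
    """Boost preserved occluding features (grayscale, in [0, 1]) so they
    remain salient against the revealed occluded structure behind them.

    `background` is the composited occluded layer the features now sit on.
    If the features already stand out enough, they are left untouched.
    """
    contrast = np.abs(features - background).mean()
    if contrast >= target_contrast:
        return features
    gain = target_contrast / (contrast + 1e-9)
    # Amplify the features' deviation from the revealed background,
    # clamped to the displayable range.
    return np.clip(background + (features - background) * gain, 0.0, 1.0)
```

The actual method measures information value before and after ghosting; this sketch only captures the "after" side of that comparison in the simplest possible form.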
IEEE Transactions on Visualization and Computer Graphics | 2017
Jens Grubert; Tobias Langlotz; Stefanie Zollmann; Holger Regenbrecht
Augmented Reality is a technique that enables users to interact with their physical environment through the overlay of digital information. While it has been researched for decades, only recently has Augmented Reality moved out of the research labs and into the field. While most current applications are used sporadically and for one particular task only, future scenarios will provide a continuous and multi-purpose user experience. Therefore, in this paper, we present the concept of Pervasive Augmented Reality, which aims to provide such an experience by sensing the user's current context and adapting the AR system to the changing requirements and constraints. We present a taxonomy for Pervasive Augmented Reality and context-aware Augmented Reality that classifies the context sources and context targets relevant for implementing such a context-aware, continuous Augmented Reality experience. We further summarize existing approaches that contribute towards Pervasive Augmented Reality. Based on our taxonomy and survey, we identify challenges for future research directions in Pervasive Augmented Reality.
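The mapping from sensed context sources to adapted context targets could be sketched as a small rule table; the specific sources (walking state, ambient light, battery) and targets (label density, brightness, rendering quality) are illustrative placeholders, not entries from the paper's taxonomy:

```python
def adapt_ar_interface(context: dict) -> dict:
    """Derive AR presentation settings from a sensed-context dictionary.

    A continuous AR system would re-run rules like these whenever the
    context changes; all keys and thresholds here are hypothetical.
    """
    settings = {"density": "full", "brightness": "normal", "quality": "high"}
    if context.get("walking"):
        settings["density"] = "reduced"      # fewer labels while moving
    if context.get("ambient_lux", 500) > 10000:
        settings["brightness"] = "boosted"   # readable in outdoor sunlight
    if context.get("battery_pct", 100) < 20:
        settings["quality"] = "low"          # trade fidelity for runtime
    return settings
```

Even a toy rule set like this illustrates the core distinction the taxonomy draws: context *sources* (what is sensed) on the input side, context *targets* (what is adapted) on the output side.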
Eurographics | 2007
Stefanie Zollmann; Oliver Bimber
We present a novel multi-step technique for imperceptible geometry and radiometry calibration of projector-camera systems. Our approach can be used to display geometry and color corrected images on non-optimized surfaces at interactive rates while simultaneously performing a series of invisible structured light projections during runtime. It supports disjoint projector-camera configurations, fast and progressive improvements, as well as real-time correction rates of arbitrary graphical content. The calibration is automatically triggered when misregistrations between camera, projector and surface are detected.
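One common way to make structured light invisible is to split each projected frame into a code-plus and code-minus pair that the eye temporally integrates back to the original image, while a synchronized camera subtracts the pair to recover the code. The sketch below shows that generic scheme under that assumption; it is not claimed to be the paper's exact coding method, and the amplitude is an arbitrary placeholder:

```python
import numpy as np

def embed_imperceptible_pattern(frame: np.ndarray, pattern: np.ndarray,
                                amplitude: float = 0.02):
    """Split one projector frame (grayscale, in [0, 1]) into two sub-frames
    embedding a binary structured-light code and its complement.

    Shown at a high enough rate, the two sub-frames average back to the
    original frame, so the code stays below the perceptual threshold.
    """
    delta = amplitude * (2.0 * pattern - 1.0)  # +a where code is 1, -a where 0
    f_plus = np.clip(frame + delta, 0.0, 1.0)
    f_minus = np.clip(frame - delta, 0.0, 1.0)
    return f_plus, f_minus

def recover_pattern(cam_plus: np.ndarray, cam_minus: np.ndarray) -> np.ndarray:
    # A camera synchronized to the sub-frames recovers the binary code
    # from the sign of the difference image.
    return (cam_plus - cam_minus) > 0
```

Near black or white the clipping breaks the symmetry, which is one reason practical systems adapt the embedding amplitude to the underlying image content.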
International Symposium on Mixed and Augmented Reality | 2012
Stefanie Zollmann; Denis Kalkofen; Christof Hoppe; Stefan Kluckner; Horst Bischof; Gerhard Reitmayr
In this paper we present an approach for visualizing time-oriented data of dynamic scenes in an on-site AR view. Visualizations of time-oriented data pose special challenges compared to the visualization of arbitrary virtual objects. Usually, the 4D data occludes a large part of the real scene. Additionally, the data sets from different points in time may occlude each other. Thus, it is important to design adequate visualization techniques that provide a comprehensible visualization. In this paper we introduce a visualization concept that uses overview-and-detail techniques to present 4D data at different levels of detail. These levels provide, first, an overview of the 4D scene; second, information about the 4D change of a single object; and third, detailed information about object appearance and geometry at specific points in time. Combining the three levels of detail with interactive transitions such as magic lenses or distorted viewing techniques enables the user to understand the relationship between them. Finally, we show how to apply this concept to construction site documentation and monitoring.
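The three detail levels could be sketched as a state selected by what the user currently has in focus; the selection logic is a plausible assumption about how such levels might be driven, not the paper's actual interaction design:

```python
from enum import IntEnum

class DetailLevel(IntEnum):
    OVERVIEW = 1  # whole 4D scene at a glance
    OBJECT = 2    # temporal change of one selected object
    FULL = 3      # appearance and geometry at one selected point in time

def level_for_selection(object_selected: bool, time_selected: bool) -> DetailLevel:
    """Map the user's current selection state to a detail level.

    No selection shows the overview; picking an object drills into its
    4D change; additionally fixing a point in time shows full detail.
    """
    if object_selected and time_selected:
        return DetailLevel.FULL
    if object_selected:
        return DetailLevel.OBJECT
    return DetailLevel.OVERVIEW
```

The interactive transitions the abstract mentions (magic lenses, distorted views) would then animate between these states so the user keeps the spatial and temporal context across level changes.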
IEEE Transactions on Visualization and Computer Graphics | 2014
Stefanie Zollmann; Christof Hoppe; Tobias Langlotz; Gerhard Reitmayr
Micro aerial vehicles equipped with high-resolution cameras can be used to create aerial reconstructions of an area of interest. In that context, automatic flight path planning and autonomous flying are often applied, but so far they cannot fully replace the human in the loop, who supervises the flight on-site to ensure that there are no collisions with obstacles. Unfortunately, this workflow raises several issues, such as the need to mentally transfer the aerial vehicle's position between 2D map positions and the physical environment, and the difficult depth perception of vehicles flying at a distance. Augmented Reality can address these issues by bringing the flight planning process on-site and visualizing the spatial relationship between the planned or current positions of the vehicle and the physical environment. In this paper, we present Augmented Reality supported navigation and flight planning of micro aerial vehicles by augmenting the user's view with relevant information for flight planning and live feedback for flight supervision. Furthermore, we introduce additional depth hints that support the user in understanding the spatial relationship of virtual waypoints in the physical world, and we investigate the effect of these visualization techniques on spatial understanding.
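One classic depth hint for a floating waypoint is a vertical line dropped to the ground plane, anchoring the point in the scene; the abstract does not specify which hints the paper uses, so the geometry below is a generic assumption with a flat-ground simplification:

```python
def depth_hint_segments(waypoints, ground_height: float = 0.0):
    """For each 3D waypoint (x, y, z), return a vertical line segment from
    the waypoint down to the ground plane.

    The foot of the line sits on the terrain directly below the waypoint,
    so the viewer can read off both its ground position and its altitude.
    A real system would intersect with actual terrain, not a flat plane.
    """
    return [((x, y, z), (x, y, ground_height)) for (x, y, z) in waypoints]
```

Rendering these segments alongside the waypoints gives the viewer a monocular depth cue that pure floating markers lack, which is the kind of effect the study in the paper investigates.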