Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Eduardo E. Veas is active.

Publication


Featured research published by Eduardo E. Veas.


Ubiquitous Computing | 2009

Handheld Augmented Reality for underground infrastructure visualization

Gerhard Schall; Erick Mendez; Ernst Kruijff; Eduardo E. Veas; Sebastian Junghanns; Bernhard Reitinger; Dieter Schmalstieg

In this paper, we present an Augmented Reality (AR) system for aiding field workers of utility companies in outdoor tasks such as maintenance, planning, or surveying of underground infrastructure. Our work addresses these issues using spatial interaction and visualization techniques for mobile AR applications, as well as a new mobile device design. We also present results from evaluations of the prototype application for underground infrastructure spanning various user groups. Our application has been driven by feedback from industrial collaborators in the utility sector, and includes a translation tool for automatically importing data from utility company databases of underground assets.


Human Factors in Computing Systems | 2011

Directing attention and influencing memory with visual saliency modulation

Eduardo E. Veas; Erick Mendez; Steven Feiner; Dieter Schmalstieg

In augmented reality, it is often necessary to draw the user's attention to particular objects in the real world without distracting her from her task. We explore the effectiveness of directing a user's attention by imperceptibly modifying existing features of a video. We present three user studies of the effects of applying a saliency modulation technique to video, evaluating modulation awareness, attention, and memory. Our results validate the saliency modulation technique as an alternative means of conveying information to the user, suggesting attention shifts and influencing recall of selected regions without perceptible changes to the visual input.
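The general idea of saliency modulation can be illustrated with a minimal sketch: subtly raise local contrast inside a target region while slightly flattening it elsewhere, keeping each change small enough to go unnoticed. This is an assumption-laden toy example, not the authors' actual modulation pipeline, and the `modulate_saliency` function, its region format, and the `boost` value are all hypothetical.

```python
import numpy as np

def modulate_saliency(frame, region, boost=0.08):
    """Illustrative sketch (not the paper's method): nudge the relative
    saliency of a rectangular region with a small contrast change.

    frame  : float array (H, W, 3) with values in [0, 1]
    region : (y0, y1, x0, x1) rectangle to emphasize
    boost  : fractional contrast increase inside the region
    """
    out = frame.copy()
    y0, y1, x0, x1 = region
    patch = out[y0:y1, x0:x1]
    mean = patch.mean(axis=(0, 1), keepdims=True)
    # Raise contrast inside the region around its local mean ...
    out[y0:y1, x0:x1] = np.clip(mean + (patch - mean) * (1 + boost), 0.0, 1.0)
    # ... and very slightly flatten the rest, so the overall appearance
    # stays stable while relative saliency shifts toward the region.
    mask = np.ones(frame.shape[:2], dtype=bool)
    mask[y0:y1, x0:x1] = False
    mean_bg = out[mask].mean()
    out[mask] = np.clip(mean_bg + (out[mask] - mean_bg) * (1 - boost / 2), 0.0, 1.0)
    return out
```

With a boost of a few percent, the per-pixel change stays well below obvious visibility, which is the property the studies above probe with much more principled saliency models.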


Virtual Reality Software and Technology | 2012

OmniKinect: real-time dense volumetric data acquisition and applications

Bernhard Kainz; Stefan Hauswiesner; Gerhard Reitmayr; Markus Steinberger; Raphael Grasset; Lukas Gruber; Eduardo E. Veas; Denis Kalkofen; Hartmut Seichter; Dieter Schmalstieg

Real-time three-dimensional acquisition of real-world scenes has many important applications in computer graphics, computer vision, and human-computer interaction. Inexpensive depth sensors such as the Microsoft Kinect have spurred the development of such applications. However, this technology is still relatively recent, and no detailed studies of its scalability to dense and view-independent acquisition have been reported. This paper addresses the question of what can be done with a larger number of Kinects used simultaneously. We describe an interference-reducing physical setup, a calibration procedure, and an extension to the KinectFusion algorithm that produces high-quality volumetric reconstructions from multiple Kinects while overcoming systematic errors in the depth measurements. We also report on enhancing image-based visual hull rendering with depth measurements, and compare the results to KinectFusion. Our system provides practical insight into the achievable spatial and radial range and into bandwidth requirements for depth data acquisition. Finally, we present a number of practical applications of our system.


International Symposium on Mixed and Augmented Reality | 2013

Adaptive ghosted views for Augmented Reality

Denis Kalkofen; Eduardo E. Veas; Stefanie Zollmann; Markus Steinberger; Dieter Schmalstieg

In Augmented Reality (AR), ghosted views allow a viewer to explore hidden structure within the real-world environment. A body of previous work has explored which features are suitable to support the structural interplay between occluding and occluded elements. However, the dynamics of AR environments pose serious challenges to the presentation of ghosted views. While a model of the real world may help determine distinctive structural features, changes in appearance or illumination degrade the composition of occluding and occluded structure. In this paper, we present an approach that considers the information value of the scene both before and after generating the ghosted view. To this end, a contrast adjustment of the preserved occluding features is computed, which adaptively varies their visual saliency within the ghosted-view visualization. This allows us not only to preserve important features, but also to maintain their prominence after revealing the occluded structure, achieving a positive effect on the perception of ghosted views.
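A bare-bones ghosting composite can make the underlying mechanism concrete: hidden structure shows through a mostly transparent occluder, while pixels on strong occluder edges receive higher opacity so the occluding structure stays legible. This is a sketch of the general idea only; the `ghosted_view` function, the `base_alpha`/`gain` parameters, and the fixed edge-to-opacity mapping are assumptions, and the paper's contribution is precisely that this mapping is adapted to the scene rather than fixed.

```python
import numpy as np

def ghosted_view(occluder, hidden, edges, base_alpha=0.35, gain=1.5):
    """Minimal ghosting composite (illustrative, not the paper's
    adaptive pipeline).

    occluder, hidden : float images in [0, 1], same shape
    edges            : per-pixel edge strength of the occluder in [0, 1]
    """
    # Mostly transparent occluder; stronger edges get higher opacity so
    # the occluding structure remains visible over the revealed content.
    alpha = np.clip(base_alpha + gain * edges, 0.0, 1.0)
    return alpha * occluder + (1.0 - alpha) * hidden
```

In the adaptive scheme described above, the contrast (here crudely modeled by the opacity term) would be recomputed per frame from the scene's information value instead of being a constant.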


IEEE Transactions on Visualization and Computer Graphics | 2012

Extended Overview Techniques for Outdoor Augmented Reality

Eduardo E. Veas; Raphael Grasset; Ernst Kruijff; Dieter Schmalstieg

In this paper, we explore techniques that aim to improve site understanding for outdoor Augmented Reality (AR) applications. While the first-person perspective in AR is a direct way of filtering and zooming on a portion of the data set, it severely narrows overview of the situation, particularly over large areas. We present two interactive techniques to overcome this problem: multi-view AR and variable perspective view. We describe in detail the conceptual, visualization, and interaction aspects of these techniques and their evaluation through a comparative user study. The results we have obtained strengthen the validity of our approach and the applicability of our methods to a large range of application domains.


Ubiquitous Computing | 2013

Mobile augmented reality for environmental monitoring

Eduardo E. Veas; Raphael Grasset; Ioan Ferencik; Thomas Grünewald; Dieter Schmalstieg

In response to dramatic changes in the environment, and supported by advances in wireless networking, pervasive sensor networks have become a common tool for environmental monitoring. However, tools for on-site visualization and interactive exploration of environmental data are still inadequate for domain experts. Current solutions are generally limited to tabular data, basic 2D plots, or standard 2D GIS tools designed for the desktop and not adapted to mobile use. In this paper, we introduce a novel augmented reality platform for 3D mobile visualization of environmental data. Following a user-centered design approach, we analyze processes, tasks, and requirements of on-site visualization tools for environmental experts. We present our multilayer infrastructure and the mobile augmented reality platform that leverages visualization of georeferenced sensor measurement and simulation data in a seamlessly integrated view of the environment.


International Symposium on Mixed and Augmented Reality | 2008

Vesp’R: design and evaluation of a handheld AR device

Eduardo E. Veas; Ernst Kruijff

This paper focuses on the design of devices for handheld spatial interaction. In particular, it addresses the requirements and construction of a new platform for interactive AR, described from an ergonomics stance, prioritizing the human factors of spatial interaction. The result is a multi-configurable platform for spatial interaction, evaluated in two AR application scenarios. The user tests validate the design with regard to grip, weight balance, and control allocation, and provide new insights into the human factors involved in handheld spatial interaction.


International Semantic Web Conference | 2014

Discovery and Visual Analysis of Linked Data for Humans

Vedran Sabol; Gerwald Tschinkel; Eduardo E. Veas; Patrick Hoefler; Belgin Mutlu; Michael Granitzer

Linked Data has grown to become one of the largest available knowledge bases. Unfortunately, this wealth of data remains inaccessible to those without in-depth knowledge of semantic technologies. We describe a toolchain enabling users without a semantic technology background to explore and visually analyse Linked Data. We demonstrate its applicability in scenarios involving data from the Linked Open Data Cloud, and research data extracted from scientific publications. Our focus is on the Web-based front-end consisting of querying and visualisation tools. The usability evaluations we performed yielded mainly positive results, confirming that the Query Wizard simplifies searching, refining, and transforming Linked Data and, in particular, that people using the Visualisation Wizard quickly learn to perform interactive analysis tasks on the resulting Linked Data sets. To make Linked Data analysis effectively accessible to the general public, our tool has been integrated into a number of live services where people use it to analyse, discover, and discuss facts with Linked Data.


International Symposium on Wearable Computers | 2016

Skin Reading: encoding text in a 6-channel haptic display

Granit Luzhnica; Eduardo E. Veas; Viktoria Pammer

This paper investigates the communication of natural language messages using a wearable haptic display. Our research spans both the design of the haptic display and the methods for communication that use it. First, three wearable configurations are proposed based on haptic perception fundamentals. To encode symbols, we devise an overlapping spatiotemporal stimulation (OST) method that distributes stimuli spatially and temporally with a minimal gap. An empirical study shows that, compared with purely spatial stimulation, OST is preferred in terms of recall. Second, we propose an encoding for the entire English alphabet and a training method for letters, words, and phrases. A second study investigates communication accuracy. It puts four participants through five sessions, for an overall training time of approximately 5 hours per participant. Results reveal that after one hour of training, participants were able to discern 16 letters and identify two- and three-letter words. They could discern the full English alphabet (26 letters, 92% accuracy) after approximately three hours of training, and after five hours participants were able to interpret words transmitted at an average duration of 0.6 s per word.
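The distinction between purely spatial stimulation and OST can be sketched in a few lines: instead of firing all actuators of a symbol at once, OST staggers their onsets by a small gap so successive stimuli overlap in time. The `ost_schedule` helper, the channel ordering, and the gap/duration values below are illustrative assumptions, not the paper's calibrated parameters.

```python
def ost_schedule(channels, onset_gap=0.08, duration=0.2):
    """Hypothetical sketch of overlapping spatiotemporal stimulation:
    the actuators encoding one symbol start one after another with a
    small onset gap, while each stimulus lasts long enough that
    consecutive stimuli overlap.

    channels : ordered actuator indices (0-5) encoding one symbol
    returns  : list of (channel, start_time, end_time) triples in seconds
    """
    return [(ch, i * onset_gap, i * onset_gap + duration)
            for i, ch in enumerate(channels)]

# Purely spatial stimulation is the onset_gap == 0 special case:
# every actuator of the symbol starts and ends together.
```

For example, `ost_schedule([0, 3, 5])` staggers three actuators 80 ms apart while each vibrates for 200 ms, so the third stimulus begins before the first has ended.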


Mobile and Ubiquitous Multimedia | 2010

Handheld devices for mobile augmented reality

Eduardo E. Veas; Ernst Kruijff

In this paper, we report on four generations of display-sensor platforms for handheld augmented reality. The paper is organized as a compendium of requirements that guided the design and construction of each generation of the handheld platforms. The first generation, reported in [17], was the result of various studies on ergonomics and human factors. Thereafter, each following iteration in the design-production process was guided by experiences and evaluations that resulted in new guidelines for future versions. We describe the evolution of hardware for handheld augmented reality and the requirements and guidelines that motivated its construction.

Collaboration


Dive into Eduardo E. Veas's collaborations.

Top Co-Authors

Vedran Sabol (Graz University of Technology)
Dieter Schmalstieg (Graz University of Technology)
Viktoria Pammer (Graz University of Technology)
Denis Kalkofen (Graz University of Technology)
Ernst Kruijff (Graz University of Technology)
Carla Barreiros (Graz University of Technology)
Hartmut Seichter (Graz University of Technology)
Markus Tatzgern (Graz University of Technology)