István Barakonyi
Graz University of Technology
Publications
Featured research published by István Barakonyi.
International Symposium on Mixed and Augmented Reality | 2003
Daniel Wagner; István Barakonyi
ARToolKit programmers are familiar with the kanji symbols supplied with the distribution, yet most do not know what these symbols mean. We propose educational software that uses collaborative augmented reality (AR) to teach users the meaning of kanji symbols. The application is laid out as a two-player AR computer game. The novelty of our approach is that we do not use regular workstations or laptops to host the AR application; instead, we use fully autonomous PDAs running the application together with an optical marker-based tracking module, which makes the application not only accessible to a broad audience but also highly mobile.
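As context for the tracking module mentioned above, here is a minimal sketch of a classic ARToolKit detection loop in C-style C++. Camera calibration and pattern loading (arInitCparam, arLoadPatt) are omitted, and the PDA port described in the paper uses its own tracking module, so this is illustrative rather than the application's actual code.

```cpp
#include <AR/ar.h>   // classic ARToolKit; arInitCparam/arLoadPatt setup omitted

// One camera frame; a real application would grab this from the (PDA)
// camera driver. 320x240 RGBA is an assumed format.
static ARUint8 frame[320 * 240 * 4];

// Look for the trained kanji pattern `patt_id` (from arLoadPatt) and,
// if found, recover the 3x4 transform used to register the overlay.
int detectKanji(int patt_id, double patt_width_mm, double trans[3][4]) {
    ARMarkerInfo *markers;      // candidate squares found in this frame
    int marker_num = 0;
    const int thresh = 100;     // binarization threshold

    if (arDetectMarker(frame, thresh, &markers, &marker_num) < 0)
        return -1;

    int best = -1;              // keep the highest-confidence match
    for (int i = 0; i < marker_num; ++i)
        if (markers[i].id == patt_id &&
            (best < 0 || markers[i].cf > markers[best].cf))
            best = i;
    if (best < 0) return -1;

    double center[2] = {0.0, 0.0};
    arGetTransMat(&markers[best], center, patt_width_mm, trans);
    return 0;
}
```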
International Symposium on Mixed and Augmented Reality | 2004
István Barakonyi; Thomas Psik; Dieter Schmalstieg
AR Puppet is a hierarchical animation framework for augmented reality agents, a research area that combines augmented reality (AR), sentient computing, and autonomous animated agents into a single coherent human-computer interface paradigm. While sentient computing systems use the physical environment as an input channel, AR outputs virtual information superimposed on real-world objects. To enhance man-machine communication with more natural and efficient information presentation, this framework adds animated agents to AR applications that make autonomous decisions based on their perception of the real environment. These agents are able to turn physical objects into interactive, responsive entities collaborating with both anthropomorphic and non-anthropomorphic virtual characters, extending AR with a previously unexplored output modality. AR Puppet explores the requirements for context-aware animated agents concerning visualization, appearance, and behavior, as well as associated technologies and application areas. A demo application with a virtual repairman collaborating with an augmented LEGO® robot illustrates our concepts.
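A hedged sketch of the sense-decide-act cycle such an agent might run appears below. The class, names, and behavior are hypothetical illustrations of the idea (an agent reacting to a tracked physical robot), not the AR Puppet API.

```cpp
#include <cmath>
#include <cstdio>

// Hypothetical sketch of a sense-decide-act cycle for an AR agent.
struct Pose { float x, y, z; };

class RepairmanAgent {
    Pose self_{0.0f, 0.0f, 0.0f};
public:
    // Perception: the tracked pose of the physical robot, delivered by
    // the AR tracking layer every frame.
    void update(const Pose& robot, float dt) {
        float dx = robot.x - self_.x, dz = robot.z - self_.z;
        float dist = std::sqrt(dx * dx + dz * dz);
        if (dist > 0.05f) {
            // Decision: the robot is out of reach, so walk toward it.
            float step = 0.5f * dt / dist;  // assumed 0.5 m/s walking speed
            self_.x += dx * step;
            self_.z += dz * step;
            std::printf("walk -> (%.2f, %.2f)\n", self_.x, self_.z);
        } else {
            // Decision: within reach, so act on the physical object.
            std::printf("play 'repair' animation at robot\n");
        }
    }
};
```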
IEEE Computer Graphics and Applications | 2007
Dieter Schmalstieg; Gerhard Schall; Daniel Wagner; István Barakonyi; Gerhard Reitmayr; Joseph Newman; Florian Ledermann
Mobile augmented reality requires georeferenced data to present world-registered overlays. To cover a wide area and all artifacts and activities, a database containing this information must be created, stored, maintained, delivered, and finally used by the client application. We present a data model and a family of techniques to address these needs.
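To make the data-model idea concrete, the sketch below shows one possible record for a world-registered overlay plus a naive proximity query. The field names and the flat in-memory store are assumptions for illustration, not the paper's actual schema.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Hypothetical record for one georeferenced overlay.
struct GeoAnnotation {
    double lat, lon;        // WGS84 anchor position, degrees
    double alt_m;           // height above the ellipsoid, meters
    std::string model_uri;  // 3D content to draw at the anchor
};

// Naive nearby query over an in-memory store; a deployed system would
// index the database spatially and page results to the mobile client.
std::vector<GeoAnnotation> queryNearby(const std::vector<GeoAnnotation>& db,
                                       double lat, double lon,
                                       double radius_m) {
    const double kMPerDeg = 111320.0;  // meters per degree of latitude (approx.)
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    std::vector<GeoAnnotation> hits;
    for (const auto& a : db) {
        double dy = (a.lat - lat) * kMPerDeg;
        double dx = (a.lon - lon) * kMPerDeg * std::cos(lat * kDegToRad);
        if (std::hypot(dx, dy) <= radius_m) hits.push_back(a);
    }
    return hits;
}
```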
International Symposium on Mixed and Augmented Reality | 2006
István Barakonyi; Dieter Schmalstieg
Most of today's Augmented Reality (AR) systems operate as passive information browsers relying on a finite and deterministic world model and a predefined hardware and software infrastructure. We propose an AR framework that dynamically and proactively exploits hitherto unknown applications and hardware devices, and adapts the appearance of the user interface to persistently stored and accumulated user preferences. Our framework explores proactive computing, multi-user interface adaptation, and user interface migration. We employ mobile and autonomous agents embodied by real and virtual objects as an interface and interaction metaphor, where agent bodies are able to opportunistically migrate between multiple AR applications and computing platforms to best match the needs of the current application context. We present two pilot applications to illustrate these design concepts.
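The migration idea can be pictured as snapshotting an agent's state and respawning it on whichever host best fits the context. The sketch below is a hypothetical wire format and placement policy, not the framework's actual protocol.

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical snapshot for migrating an agent body between AR hosts.
struct AgentSnapshot {
    std::string agent_id;        // stable identity across hosts
    std::string preferred_body;  // e.g. "3d-character" or "2d-sprite"
    std::vector<uint8_t> state;  // serialized behavior state + user prefs
};

// Placement policy: a handheld gets the lightweight 2D embodiment,
// a workstation the agent's preferred full 3D one.
std::string pickBody(const AgentSnapshot& s, bool host_is_handheld) {
    return host_is_handheld ? "2d-sprite" : s.preferred_body;
}
```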
International Conference on Entertainment Computing | 2005
István Barakonyi; Dieter Schmalstieg
Augmented reality (AR) has recently stepped beyond its usual application areas, such as machine maintenance, military training, and production, into the realm of entertainment, including computer gaming. This paper discusses the potential AR environments offer for embodied animated agents and demonstrates several advanced immersive content and scenario authoring techniques in AR through example applications.
IEEE MultiMedia | 2004
Naiwala P. Chandrasiri; Takeshi Naemura; Mitsuru Ishizuka; Hiroshi Harashima; István Barakonyi
The authors developed a system that animates 3D facial agents based on real-time facial expression analysis techniques and research on synthesizing facial expressions and text-to-speech capabilities. The system combines visual, auditory, and primary text-based interfaces into one coherent multimodal chat experience. Users represent themselves with agents selected from a predefined group. When a user shows a particular expression while typing a message, the 3D agent at the receiving end speaks the message aloud, replays the recognized facial expression sequence, and augments the synthesized voice with appropriate emotional content. Because the visual data exchange is based on the MPEG-4 high-level Facial Animation Parameter for facial expressions (FAP 2), rather than real-time video, the method requires very low bandwidth.
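The bandwidth claim follows from what the high-level expression parameter actually carries: an expression identifier plus an intensity instead of pixels. The packing below is an illustrative approximation of that payload, not the exact MPEG-4 bitstream syntax.

```cpp
#include <cstdint>
#include <cstdio>

// Illustrative per-frame expression payload: a named archetypal
// expression and its strength, rather than encoded video.
struct ExpressionFrame {
    uint8_t expression;  // 1..6: joy, sadness, anger, fear, disgust, surprise
    uint8_t intensity;   // 0 = neutral .. 63 = full strength
};

int main() {
    // A couple of bytes per frame versus kilobytes for compressed video.
    std::printf("FAP-style frame: %zu bytes\n", sizeof(ExpressionFrame));
    return 0;
}
```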
Symposium on 3D User Interfaces | 2007
István Barakonyi; Helmut Prendinger; Dieter Schmalstieg; Mitsuru Ishizuka
We have implemented an augmented reality videoconferencing system that inserts virtual graphics overlays into the live video stream of remote conference participants. The virtual objects are manipulated using a novel interaction technique that cascades bimanual tangible interaction and eye tracking. User studies show that our user interface enriches remote collaboration by offering hitherto unexplored ways of collaborative object manipulation, such as gaze-controlled ray picking of remote physical and virtual objects.
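Gaze-controlled ray picking reduces to intersecting a gaze ray with object bounds. The sketch below shows the core ray-sphere test under the assumption that the eye tracker's 2D gaze point has already been unprojected into a world-space ray (matrix plumbing omitted).

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Does the gaze ray (origin o, unit direction d) hit an object whose
// bounds are approximated by a sphere of radius r centered at c?
bool gazeHits(Vec3 o, Vec3 d, Vec3 c, float r) {
    Vec3 oc = sub(c, o);
    float t = dot(oc, d);               // closest approach along the ray
    float dist2 = dot(oc, oc) - t * t;  // squared distance at that point
    return t > 0.0f && dist2 <= r * r;
}
```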
International Conference on Computer Graphics and Interactive Techniques | 2006
István Barakonyi; Dieter Schmalstieg
Modelers and animators often rely on real-life references to build and animate 3D characters for games and film production. Videotaping a real subject and manipulating mock-ups support the creation of precise and expressive character animation in virtual content creation environments such as 3D modeling and animation packages. We propose to use Augmented Reality (AR) to bridge the gap between physical and virtual production environments by superimposing 3D graphics on real-world objects. We have created tools based on the Studierstube AR framework [Schmalstieg et al. 2002] that improve the character animation pipeline by exploiting the physical world as a user interface.
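One way such tools can exploit the physical world as an interface is to keyframe a tracked mock-up's pose onto a character joint each frame. The sketch below illustrates that mapping with hypothetical names; it is not the Studierstube API.

```cpp
#include <utility>
#include <vector>

struct Pose { float pos[3]; float quat[4]; };

// A growing animation curve for one character joint.
struct JointTrack {
    std::vector<std::pair<float, Pose>> keys;  // (time, pose) keyframes
    void record(float t, const Pose& p) { keys.emplace_back(t, p); }
};

// Each frame, copy the tracker's pose for the physical prop onto the
// joint it drives, building an animation curve from manipulation.
void samplePropToJoint(JointTrack& joint, const Pose& tracked_prop, float now) {
    joint.record(now, tracked_prop);
}
```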
IEEE Pervasive Computing | 2009
Alessandro Mulloni; Daniel Wagner; István Barakonyi; Dieter Schmalstieg
Graphics Interface | 2004
István Barakonyi; Tamer Fahmy; Dieter Schmalstieg