Ignacio Alvarez
Intel
Publication
Featured research published by Ignacio Alvarez.
automotive user interfaces and interactive vehicular applications | 2015
Ignacio Alvarez; Laura Rumbel; Robert Adams
This paper introduces Skyline, a user experience prototyping platform for automotive, developed at Intel Labs to enable rapid, iterative development of in-vehicle experiences. The paper describes the hardware and software components of Skyline, highlighting the flexibility of the interior HMI configuration and the accessibility of the development platform, which is based on open-source web technologies such as JavaScript, CSS, Node.js, and MQTT. The paper steps through the development and user-testing processes for a cockpit experience built with Skyline, illustrating the benefits of capturing early qualitative user feedback in support of rapid prototyping. Finally, the paper outlines the potential benefits of high-fidelity assets developed on the platform for both industry and academia, and the value that documented user experience HMI assets can have for in-vehicle feature productization and research.
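The abstract gives no code, but since Skyline is described as built on JavaScript, Node.js, and MQTT, the following minimal sketch illustrates how a prototype HMI component might exchange simulated vehicle signals over MQTT in Node.js. The topic names, payload schema, and threshold are hypothetical illustrations, not Skyline's actual interfaces; the sketch assumes a local MQTT broker and the open-source `mqtt` npm package.

```javascript
// Minimal sketch (not Skyline's actual API): an HMI widget process that
// subscribes to simulated vehicle signals over MQTT and reacts to them.
const mqtt = require('mqtt');

const client = mqtt.connect('mqtt://localhost:1883');

// Hypothetical topic names, for illustration only.
const SPEED_TOPIC = 'vehicle/signals/speed';
const WARNING_TOPIC = 'hmi/cluster/warning';

client.on('connect', () => {
  client.subscribe(SPEED_TOPIC, (err) => {
    if (err) console.error('subscribe failed:', err);
  });
});

client.on('message', (topic, payload) => {
  if (topic !== SPEED_TOPIC) return;
  const { kph } = JSON.parse(payload.toString()); // assumed JSON payload

  // Example adaptation rule: publish a warning for the cluster UI
  // when the simulated speed exceeds a threshold.
  if (kph > 120) {
    client.publish(WARNING_TOPIC, JSON.stringify({ level: 'caution', text: 'Reduce speed' }));
  }
});
```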
Archive | 2017
Andreas Riener; Myounghoon Jeon; Ignacio Alvarez; Anna K. Frison
Given the rapid advancement of technologies in the automotive domain, driver-vehicle interaction has recently become more and more complicated. The amount of research applied to the vehicle cockpit is increasing with the advent of (highly) automated driving, as the range of interaction possible in a driving vehicle expands. However, as opportunities increase, so does the number of challenges that automotive user experience designers and researchers face. This chapter focuses on the instrumentation of sensing and display techniques and technologies to improve the user experience while driving. In the driver-vehicle interaction loop, the vehicle can sense driver states; analyze, estimate, and model the data; and then display it through the appropriate channels for intervention purposes. To improve this interaction, a large number of new, affordable sensing techniques (EEG, fNIRS, IR imaging) and feedback techniques (head-up displays, auditory feedback, tactile arrays, etc.) have been introduced, yet little research has investigated this area in a systematic way. This chapter provides an overview of recent advances in input and output modalities for timely, appropriate driver-vehicle interaction. After outlining the relevant background, we summarize best-known practices for input and output modalities based on results exchanged at the workshop on practical experiences for measuring and modeling drivers and driver-vehicle interactions at AutomotiveUI 2015. This chapter can help answer research questions on how to instrument a driving simulator or realistic driving study to gather data and how to place interaction outputs to enable appropriate driver interactions.
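The chapter is a survey, but the driver-vehicle interaction loop it describes (sense driver state, estimate it, then display feedback through an appropriate channel) can be made concrete with a short sketch. The sensor signals, the toy workload estimator, and the channel thresholds below are all assumptions chosen purely for illustration.

```javascript
// Minimal sketch of the sense -> estimate -> display loop described above.
// Sensor values, the workload model, and output channels are hypothetical.
function estimateWorkload(sensorSample) {
  // Toy estimator: weight a few (assumed) normalized signals into one score.
  const { heartRate, gazeOffRoadRatio, steeringReversals } = sensorSample;
  return 0.4 * heartRate + 0.4 * gazeOffRoadRatio + 0.2 * steeringReversals;
}

function selectOutputChannel(workload) {
  // Higher estimated workload -> less visually demanding feedback channel.
  if (workload > 0.8) return 'tactile';   // e.g. seat vibration
  if (workload > 0.5) return 'auditory';  // e.g. spoken prompt
  return 'visual';                        // e.g. head-up display
}

// One iteration of the loop with a fabricated sensor sample.
const sample = { heartRate: 0.7, gazeOffRoadRatio: 0.6, steeringReversals: 0.3 };
const workload = estimateWorkload(sample);
console.log(`workload=${workload.toFixed(2)}, channel=${selectOutputChannel(workload)}`);
```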
automotive user interfaces and interactive vehicular applications | 2016
Andreas Riener; Myounghoon Jeon; Ignacio Alvarez; Bastian Pfleging; Alexander G. Mirnig; Manfred Tscheligi; Lewis L. Chuang
On July 1st, 2016, the first automated-vehicle fatality became headline news [9] and caused a nationwide wave of concern. We now have at least one situation in which an automated vehicle control system failed to detect a life-threatening situation. The question remains: how can an autonomous system make ethical decisions that involve human lives? Control negotiation strategies require prior encoding of ethical conventions into decision-making algorithms, which is not at all an easy task, especially considering that coming up with ethically sound decision strategies in the first place is often very difficult, even for human agents. This workshop seeks to provide a forum for experts across different backgrounds to voice and formalize the ethical aspects of automotive user interfaces in the context of automated driving. The goal is to derive working principles that will guide shared decision-making between human drivers and their automated vehicles.
Archive | 2017
Andreas Löcken; Shadan Sadeghian Borojeni; Heiko Müller; Thomas M. Gable; Stefano Triberti; Cyriel Diels; Christiane Glatz; Ignacio Alvarez; Lewis L. Chuang; Susanne Boll
Informing a driver of a vehicle’s changing state and environment is a major challenge that grows with the introduction of in-vehicle assistance and infotainment systems. Even in the age of automation, the human will need to be in the loop for monitoring, taking over control, or making decisions. In these cases, poorly designed systems could place needless attentional demands on the driver, drawing attention away from the primary driving task. Existing systems offer simple and often unspecific alerts, leaving the human with the demanding task of identifying, localizing, and understanding the problem. Ideally, such systems should communicate information in a way that conveys its relevance and urgency. Specifically, information that promotes driver safety should be conveyed as an effective call for action, while information not pertaining to safety (and therefore less important) should be conveyed in ways that do not jeopardize driver attention. Adaptive ambient displays and peripheral interactions have the potential to provide superior solutions: they could unobtrusively present information, shift the driver’s attention according to changing task demands, or enable a driver to react without losing focus on the primary task. To build a common understanding across researchers and practitioners from different fields, we held a “Workshop on Adaptive Ambient In-Vehicle Displays and Interactions” at the AutomotiveUI ’15 conference. In this chapter, we discuss the outcomes of this workshop, provide examples of possible applications now and in the future, and conclude with challenges in developing and using adaptive ambient interactions.
automotive user interfaces and interactive vehicular applications | 2016
Rod McCall; Martin Baumann; Ioannis Politis; Shadan Sadeghian Borojeni; Ignacio Alvarez; Alexander G. Mirnig; Alexander Meschtscherjakov; Manfred Tscheligi; Lewis L. Chuang; Jacques M. B. Terken
This workshop will focus on the problem of occupant and vehicle situational awareness with respect to automated vehicles when the driver must take over control. It will explore the future of fully automated and mixed-traffic situations where vehicles are assumed to be operating at Level 3 or above. In this case, all critical driving functions will be handled by the vehicle, with the possibility of transitions between manual and automated driving modes at any time. This creates a driving environment where, unlike manual driving, there is no direct intrinsic motivation for the driver to be aware of the traffic situation at all times. Therefore, it is highly likely that when such a transition occurs, the driver will not be able to take over control safely or within an appropriate period of time. This workshop will address this challenge by inviting experts and practitioners from the automotive and related domains to explore concepts and solutions to increase, maintain, and transfer situational awareness in semi-automated vehicles.
automotive user interfaces and interactive vehicular applications | 2016
Victor Palacios Rivera; Laura Rumbel; Ignacio Alvarez
Automotive user experiences can be increasingly personalized and adaptive thanks to advances in in-vehicle sensors and user modelling, but current automotive software development frameworks still require large development efforts to create custom interaction solutions. In this paper we propose a novel system architecture aimed at supporting automotive researchers and designers by simplifying the prototyping of novel adaptive user interfaces. We describe the integration of RealSense sensors and the Context Sensing SDK with the Skyline driving simulator framework; the combination of these tools allows rapid prototyping of in-cabin, context-aware interactions. The paper presents two use cases of in-cabin-aware prototypes: a user-profile loading interface that recognizes identities and occupant roles, and an L4-to-L3 take-over control interface that uses the RealSense and Context Sensing APIs to detect in-vehicle events and Skyline to present real-time adaptive warning interfaces. The resulting experiences are core components of an intelligent ADAS framework for research on IVI personalization and highly automated collaborative driving.
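The abstract only describes the architecture, but a small sketch can illustrate the kind of event-driven adaptation it mentions: a take-over handler that listens for a driver-attention event (as might be produced by a RealSense-based context pipeline) and raises a warning in the prototype UI. All event names and fields below are assumptions for illustration, not the Context Sensing SDK's or Skyline's real APIs.

```javascript
// Sketch of an L4-to-L3 take-over flow, assuming a context pipeline
// (e.g. fed by RealSense-based sensing) emits driver-state events.
// Event names and fields are hypothetical, not the real SDK's API.
const { EventEmitter } = require('events');

const contextBus = new EventEmitter();

// Escalate the take-over request if the driver is not ready to resume control.
contextBus.on('driver.attention', ({ eyesOnRoad, handsOnWheel }) => {
  if (!eyesOnRoad || !handsOnWheel) {
    presentWarning({ urgency: 'high', message: 'Please take over control' });
  } else {
    presentWarning({ urgency: 'low', message: 'Take-over requested' });
  }
});

// Stand-in for pushing an adaptive warning to the simulator UI
// (Skyline would render this; here we just log it).
function presentWarning(warning) {
  console.log(`[HMI] ${warning.urgency.toUpperCase()}: ${warning.message}`);
}

// Simulated context event for demonstration.
contextBus.emit('driver.attention', { eyesOnRoad: false, handsOnWheel: true });
```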
automotive user interfaces and interactive vehicular applications | 2018
Esther Bosch; Michael Oehl; Myounghoon Jeon; Ignacio Alvarez; Jennifer Healey; Wendy Ju; Christophe Jallais
In-car emotion detection and regulation have become an emerging and important branch of research within the automotive domain. Different emotional states can greatly influence human driving performance and user experience in both manual and automated driving conditions. The monitoring and regulation of relevant emotional states is therefore important to avoid critical driving scenarios with the human driver in charge, and to ensure comfort and acceptance in autonomous driving. In this workshop we want to discuss empathic user interface research to address challenges and opportunities and to reveal new research directions for future work. The workshop provides a forum for exchange and discussion on empathic user interfaces, including methods for emotion recognition and regulation, empathic automotive human-machine interaction design, user evaluation and measurement, and subsequent improvement of the autonomous driving experience.
Archive | 2017
Ignacio Alvarez; Adam Jordan; Juliana Knopf; Darrell LeBlanc; Laura Rumbel; Alexandra C. Zafiroglu
In-vehicle experiences consist mainly of mundane small moments, repeated practices, and taken-for-granted decisions that make up daily life in and around private passenger vehicles. Understanding what those experiences are for drivers around the world presents an opportunity for designing novel interactive experiences, technologies, and user interfaces for vehicles. In this chapter, we present a set of tools, methodologies, and practices that will help readers create a holistic design space for future mobility. Transitioning between ethnography, insights, prototyping, experience design, and requirements decomposition is a challenging task even for experienced UX professionals. This chapter provides guidance on this process, illustrated with practical examples.
Archive | 2017
Hanan Alnizami; Ignacio Alvarez; Juan E. Gilbert
Advancements in in-vehicle technologies and the development of mobile applications that keep a driver connected while driving have created an increasingly serious safety concern. Distracted driving has gained the attention of legislators and governments globally. Countries have instituted bans that partially or fully forbid drivers from using devices while driving, especially hindering out-of-the-vehicle communications. This paper introduces Voiceing™, a voice-activated application intended to improve social communications in the car and to serve as a safe alternative to distracted driving. Other interaction modalities, such as texting, in-vehicle conversations, and outside-of-the-vehicle conversations, were measured and compared with Voiceing™ to investigate effects on driving performance, cognitive load, and user acceptance. Results from this study suggest that Voiceing™ is a safer alternative to in-vehicle interactions with humans. Results also show that natural speech interaction with in-vehicle applications and the inclusion of context awareness help improve driving performance while interacting with a vehicle system.
Archive | 2013
Hans-Peter Fischer; Ignacio Alvarez