Mariam Hassib
University of Stuttgart
Publication
Featured research published by Mariam Hassib.
human computer interaction with mobile devices and services | 2015
Florian Alt; Stefan Schneegass; Alireza Sahami Shirazi; Mariam Hassib; Andreas Bulling
Common user authentication methods on smartphones, such as lock patterns, PINs, or passwords, impose a trade-off between security and password memorability. Image-based passwords were proposed as a secure and usable alternative. As of today, however, it remains unclear how such schemes are used in the wild. We present the first study to investigate how image-based passwords are used over long periods of time in the real world. Our analyses are based on data from 2318 unique devices collected over more than one year using a custom application released in the Android Play store. We present an in-depth analysis of what kind of images users select, how they define their passwords, and how secure these passwords are. Our findings provide valuable insights into real-world use of image-based passwords and inform the design of future graphical authentication schemes.
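The abstract does not spell out how password strength was quantified; a generic, hypothetical sketch of one common measure for point-based image passwords is the theoretical guess-space entropy (the grid size and point count below are illustrative, not taken from the paper):

```python
import math

def guess_space_bits(grid_cells, num_points, ordered=True):
    """Theoretical entropy (in bits) of a point-based image password:
    num_points taps, each resolved to one of grid_cells tolerance cells.
    An ordered selection yields grid_cells ** num_points candidates."""
    space = grid_cells ** num_points
    if not ordered:
        space //= math.factorial(num_points)  # ignore tap order
    return math.log2(space)

# Illustrative: 3 taps on a ~100-cell tolerance grid
print(guess_space_bits(100, 3))  # ≈ 19.9 bits
```

The tolerance grid models the fact that users cannot tap pixel-exact positions, so each tap is quantized to a cell before comparison.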
human factors in computing systems | 2017
Mariam Hassib; Max Pfeiffer; Stefan Schneegass; Michael Rohs; Florian Alt
The human body reveals emotional and bodily states through measurable signals, such as body language and electroencephalography. However, such manifestations are difficult to communicate to others remotely. We propose EmotionActuator, a proof-of-concept system to investigate the transmission of emotional states in which the recipient performs emotional gestures to understand and interpret the state of the sender. We call this kind of communication embodied emotional feedback, and present a prototype implementation. To realize our concept we chose four emotional states: amused, sad, angry, and neutral. We designed EmotionActuator through a series of studies to assess emotional classification via EEG, and created an EMS gesture set by comparing composed gestures from the literature to sign-language gestures. In a final study with the end-to-end prototype, interviews revealed that participants like implicit sharing of emotions and find the embodied output immersive, but want to have control over which emotions are shared and with whom. This work contributes a proof-of-concept system and a set of design recommendations for designing embodied emotional feedback systems.
human factors in computing systems | 2017
Mariam Hassib; Stefan Schneegass; Philipp Eiglsperger; Niels Henze; Albrecht Schmidt; Florian Alt
Obtaining information about audience engagement in presentations is a valuable asset for presenters in many domains. Prior literature mostly utilized explicit methods of collecting feedback, which induce distractions, add workload on the audience, and do not provide objective information to presenters. We present EngageMeter - a system that allows fine-grained information on audience engagement to be obtained implicitly from multiple brain-computer interfaces (BCIs) and fed back to presenters for real-time and post-hoc access. Through an evaluation during an HCI conference (N = 11 audience members, N = 3 presenters) we found that EngageMeter provides value to presenters (a) in real time, since it allows reacting to current engagement scores by changing tone or adding pauses, and (b) post hoc, since presenters can adjust their slides and embed extra elements. We discuss how EngageMeter can be used in collocated and distributed audience sensing, as well as how it can aid presenters in long-term use.
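The abstract leaves the engagement metric unspecified; a common choice in the passive-BCI literature, and one plausible sketch of such a score (not necessarily EngageMeter's exact pipeline), is the band-power ratio beta / (alpha + theta) computed per EEG window:

```python
import numpy as np

def band_power(window, fs, low, high):
    """Mean spectral power of a 1-D EEG window in the [low, high) Hz band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2 / len(window)
    return psd[(freqs >= low) & (freqs < high)].mean()

def engagement_index(window, fs=128):
    """Engagement score beta / (alpha + theta); higher means more engaged."""
    theta = band_power(window, fs, 4, 8)    # 4-8 Hz
    alpha = band_power(window, fs, 8, 13)   # 8-13 Hz
    beta = band_power(window, fs, 13, 30)   # 13-30 Hz
    return beta / (alpha + theta)
```

Streaming such a score per window, smoothed over a few seconds, would yield the kind of real-time feedback described above.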
human factors in computing systems | 2017
Mariam Hassib; Daniel Buschek; Pawel W. Wozniak; Florian Alt
Textual communication via mobile phones suffers from a lack of context and emotional awareness. We present a mobile chat application, HeartChat, which integrates heart rate as a cue to increase awareness and empathy. Through a literature review and a focus group, we identified design dimensions important for heart rate augmented chats. We created three concepts showing heart rate per message, in real time, or sending it explicitly. We tested our system in a two-week in-the-wild study with 14 participants (7 pairs). Interviews and questionnaires showed that HeartChat supports empathy between people, in particular close friends and partners. Sharing heart rate helped them implicitly understand each other's context (e.g. location, physical activity) and emotional state, and sparked curiosity on special occasions. We discuss opportunities, challenges, and design implications for enriching mobile chats with physiological sensing.
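The "heart rate per message" concept can be sketched as a message record that carries one heart-rate sample captured at send time (the field names here are hypothetical, not HeartChat's actual data model):

```python
import time
from dataclasses import dataclass, field

@dataclass
class ChatMessage:
    sender: str
    text: str
    heart_rate_bpm: int  # sample captured when the message is sent
    sent_at: float = field(default_factory=time.time)

msg = ChatMessage("alice", "on my way!", heart_rate_bpm=96)
```

The real-time and explicit-sharing concepts would instead stream samples continuously alongside the chat, or attach a sample only when the sender chooses to share it.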
international symposium on wearable computers | 2015
Stefan Schneegass; Mariam Hassib; Bo Zhou; Jingyuan Cheng; Fernando Seoane; Oliver Amft; Paul Lukowicz; Albrecht Schmidt
Smart textiles have been researched in the lab over the last 20 years. However, the gap between research and available mass-market products is huge. We identify challenges that are the core reasons for this gap. To tackle these challenges, we present our work towards a multipurpose smart textile with different sensing modalities. It separates the concerns of developing textiles, electronics, infrastructure, and applications. Furthermore, it uses an application model similar to that of current smartphones, allowing developers to create applications for smart textiles. We believe that this approach is capable of moving smart textiles from niche to mainstream.
Smart Textiles | 2017
Jingyuan Cheng; Bo Zhou; Paul Lukowicz; Fernando Seoane; Matija Varga; Andreas Mehmann; Peter Chabrecek; Werner Gaschler; Karl Goenner; Hansjürgen Horter; Stefan Schneegass; Mariam Hassib; Albrecht Schmidt; Martin Freund; Rui Zhang; Oliver Amft
Textiles are pervasive in our lives, covering the human body and objects, as well as serving in industrial applications. In individuals' everyday use, smart textiles become a promising medium for monitoring, information retrieval, and interaction. While there are many applications in sport, health care, and industry, state-of-the-art smart textiles are still found only in niche markets. To gain mass-market capability, we see the necessity of generalizing and modularizing smart textile production and application development, which on one end lowers production cost and on the other end enables easy deployment. In this chapter, we demonstrate our initial effort at modularization. By devising universal sensing fabrics with conductive and non-conductive patches, smart textiles can be constructed from basic, reusable components. Using these fabric blocks, we present four sensing modalities: resistive pressure, capacitive, bioimpedance, and biopotential. In addition, we present a multi-channel textile–electronics interface and various applications built on top of the basic building blocks following a 'cut and sew' principle.
human factors in computing systems | 2016
Thomas Kosch; Mariam Hassib; Albrecht Schmidt
As Brain-Computer Interfaces become available on the consumer market, new opportunities arise for analyzing brain activity in response to different external stimuli. Current output modalities often generate a lot of data, such as an electroencephalogram that only displays electrode measurements. We introduce a three-dimensional real-time brain data visualization based on the values measured by a brain-computer interface. Instead of visualizing the voltages collected at the electrodes, we calculate a current density distribution to estimate the origin of the electrical sources responsible for the values perceived at the electrodes. Understanding where the centers of activation in the brain are makes it possible to better understand the relationship between external stimuli and brain activity. This could be relevant in the context of information presentation for doctors analyzing pathological phenomena. A pilot study was conducted using virtual reality as the input stimulus. Results indicate visible real-time changes in brain activation.
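The abstract does not name the inverse method used to obtain the current density distribution; one standard family of approaches is a regularized minimum-norm estimate, sketched here under the assumption of a known lead-field (forward-model) matrix:

```python
import numpy as np

def min_norm_inverse(leadfield, voltages, lam=1e-2):
    """Minimum-norm estimate of source current densities.

    leadfield: (n_electrodes, n_sources) matrix mapping source currents
    to scalp potentials; voltages: the measured electrode values.
    Returns the j minimizing ||leadfield @ j - voltages||^2 + lam * ||j||^2,
    via the closed form j = L^T (L L^T + lam * I)^-1 v.
    """
    L = leadfield
    gram = L @ L.T + lam * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(gram, voltages)
```

The regularization term lam trades off fitting the electrode readings against keeping the estimated source distribution small, which is what makes the underdetermined inverse problem (few electrodes, many candidate sources) solvable at all.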
international symposium on wearable computers | 2014
Stefan Schneegass; Mariam Hassib; Tobias Birmili; Niels Henze
Wearable devices and smart garments have emerged as a significant research domain over the last decades. Despite the increasing commercial interest, however, smart garments are almost exclusively developed in academia, and the developed systems do not exceed a prototypical level. We argue that the main reason why smart garments cannot be produced on a commercially relevant scale today is that they each focus on a specific use case. There is no tool support for application developers and no defined APIs within the software and hardware stack that allow developing useful smart garment applications. In this paper we present our work towards Garment OS, a layered software stack that encapsulates different levels of abstraction. We highlight the design of the system, which is based on open web protocols. We present an evaluation with software engineers and derive directions for future work.
mobile and ubiquitous multimedia | 2017
Mariam Hassib; Mohamed Khamis; Susanne Friedl; Stefan Schneegass; Florian Alt
Today's workplaces are dynamic and complex. Digital data sources such as email and video conferencing aim to support workers but also add to their burden of multitasking. Psychophysiological sensors such as electroencephalography (EEG) can provide users with cues about their cognitive state. We introduce BrainAtWork, a workplace engagement and task logger which shows users their cognitive state while working on different tasks. In a lab study with eleven participants working on their own real-world tasks, we gathered 16 hours of EEG and PC logs which were labeled into three classes: central, peripheral, and meta work. We evaluated the usability of BrainAtWork via questionnaires and interviews. We investigated the correlations between measured cognitive engagement from EEG and subjective responses from experience sampling probes. Using random forest classification, we show the feasibility of automatically labeling work tasks into work classes. We discuss how BrainAtWork can support workers in the long term by encouraging reflection and helping with task scheduling.
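The random-forest labeling step can be sketched as follows with scikit-learn; the placeholder features and hyperparameters below are assumptions, since the abstract does not specify the paper's actual feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data standing in for per-window EEG features; labels follow
# the paper's three classes: central, peripheral, and meta work.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = rng.choice(["central", "peripheral", "meta"], size=300)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation accuracy
```

On real EEG-derived features the cross-validation score indicates how well the work classes can be separated; on this random placeholder data it stays near chance.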
augmented human international conference | 2014
Alireza Sahami Shirazi; Mariam Hassib; Niels Henze; Albrecht Schmidt; Kai Kunze