Publications


Featured research published by Michael A. Nees.


Reviews of Human Factors and Ergonomics | 2011

Auditory displays for in-vehicle technologies

Michael A. Nees; Bruce N. Walker

Modern vehicle cockpits have begun to incorporate a number of information-rich technologies, including systems to enhance and improve driving and navigation performance and also driving-irrelevant information systems. The visually intensive nature of the driving task requires these systems to adopt primarily nonvisual means of information display, and the auditory modality represents an obvious alternative to vision for interacting with in-vehicle technologies (IVTs). Although the literature on auditory displays has grown tremendously in recent decades, to date, few guidelines or recommendations exist to aid in the design of effective auditory displays for IVTs. This chapter provides an overview of the current state of research and practice with auditory displays for IVTs. The role of basic auditory capabilities and limitations as they relate to in-vehicle auditory display design is discussed. Extant systems and prototypes are reviewed, and when possible, design recommendations are made. Finally, research needs and an iterative design process to meet those needs are discussed.


International Journal of Human–Computer Interaction | 2015

Menu Navigation With In-Vehicle Technologies: Auditory Menu Cues Improve Dual Task Performance, Preference, and Workload

Myounghoon Jeon; Thomas M. Gable; Benjamin K. Davison; Michael A. Nees; Jeff Wilson; Bruce N. Walker

Auditory display research for driving has mainly examined a limited range of tasks (e.g., collision warnings, cell phone tasks). In contrast, the goal of this project was to evaluate the effectiveness of enhanced auditory menu cues in a simulated driving context. The advanced auditory cues of “spearcons” (compressed speech cues) and “spindex” (a speech-based index cue) were predicted to improve both menu navigation and driving. Two experiments used a dual task paradigm in which users selected songs on the vehicle’s infotainment system. In Experiment 1, 24 undergraduates played a simple, perceptual-motor ball-catching game (the primary task; a surrogate for driving), and navigated through an alphabetized list of 150 song titles, rendered as an auditory menu, as a secondary task. The menu was presented either in the typical visual-only manner, or enhanced with text-to-speech (TTS), or TTS plus one of three types of additional auditory cues. In Experiment 2, 34 undergraduates conducted the same secondary task while driving in a simulator. In both experiments, performance on both the primary task (success rate of the game or driving performance) and the secondary task (menu search time) was better with the auditory menus than with no sound. Perceived workload scores as well as user preferences favored the enhanced auditory cue types. These results show that adding audio, and enhanced auditory cues in particular, can allow a driver to operate the menus of in-vehicle technologies more efficiently while driving more safely. Results are discussed in relation to multiple resource theory.
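
As a concrete illustration of the cue type described above, the sketch below shows one way a spearcon could be generated: a text-to-speech rendering of a menu item is time-compressed, with pitch preserved, until it no longer registers as ordinary speech. This is a minimal sketch and not the authors' implementation; the input file name and the 40% duration ratio are illustrative assumptions.

```python
# Minimal spearcon-generation sketch (not the authors' code).
# Assumes a pre-rendered TTS clip "song_title.wav"; the 40%
# duration ratio is an illustrative value, not one from the paper.
import librosa
import soundfile as sf

y, sr = librosa.load("song_title.wav", sr=None)        # hypothetical TTS clip
rate = 1.0 / 0.4                                       # compress to ~40% of original duration
spearcon = librosa.effects.time_stretch(y, rate=rate)  # pitch-preserving time compression
sf.write("song_title_spearcon.wav", spearcon, sr)
```

Spindex cues, by contrast, are typically much shorter source clips: brief speech renderings of each item's initial letter, played as an index into the alphabetized list.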


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016

Acceptance of Self-driving Cars: An Examination of Idealized versus Realistic Portrayals with a Self-driving Car Acceptance Scale

Michael A. Nees

Despite enthusiastic speculation about the potential benefits of self-driving cars, to date little is known about the factors that will affect drivers’ acceptance or rejection of this emerging technology. Gaining acceptance from end users will be critical to the widespread deployment of self-driving vehicles. Long-term acceptance may be harmed if initial acceptance is built upon unrealistic expectations developed before people interact with these systems. A brief (24-item) measurement scale was created to assess acceptance of self-driving cars. Before completing the scale, participants were randomly assigned to read short vignettes that featured either a realistic or an idealistic description of a friend’s experiences during the first six months of owning a self-driving car. A small but significant effect showed that reading an idealized portrayal in the vignette resulted in higher acceptance of self-driving cars. Potential factors affecting user acceptance of self-driving cars are discussed. Establishing realistic expectations about the performance of automation before users interact with self-driving cars may be important for long-term acceptance.
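
For readers who want to picture the core analysis, a minimal sketch follows: acceptance-scale scores from the two randomly assigned vignette groups compared with an independent-samples t test. All numbers are simulated stand-ins, not the study's data; group sizes, means, and spreads are assumptions.

```python
# Illustrative analysis sketch with simulated data (not the study's).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical mean item scores on the 24-item acceptance scale
idealized = rng.normal(loc=5.1, scale=0.8, size=40)
realistic = rng.normal(loc=4.8, scale=0.8, size=40)

t, p = stats.ttest_ind(idealized, realistic)              # independent-samples t test
d = (idealized.mean() - realistic.mean()) / np.sqrt(
    (idealized.var(ddof=1) + realistic.var(ddof=1)) / 2)  # Cohen's d (equal-n pooled SD)
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```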


Human Factors | 2016

Speech Auditory Alerts Promote Memory for Alerted Events in a Video-Simulated Self-Driving Car Ride

Michael A. Nees; Benji Helbein; Anna Porter

Objective: Auditory displays could be essential to helping drivers maintain situation awareness in autonomous vehicles, but to date, few or no studies have examined the effectiveness of different types of auditory displays for this application scenario. Background: Recent advances in the development of autonomous vehicles (i.e., self-driving cars) have suggested that widespread automation of driving may be tenable in the near future. Drivers may be required to monitor the status of automation programs and vehicle conditions as they engage in secondary leisure or work tasks (entertainment, communication, etc.) in autonomous vehicles. Method: An experiment compared memory for alerted events—a component of Level 1 situation awareness—using speech alerts, auditory icons, and a visual control condition during a video-simulated self-driving car ride with a visual secondary task. The alerts gave information about the vehicle’s operating status and the driving scenario. Results: Speech alerts resulted in better memory for alerted events. Both auditory display types resulted in less perceived effort devoted toward the study tasks but also greater perceived annoyance with the alerts. Conclusion: Speech auditory displays promoted Level 1 situation awareness during a simulation of a ride in a self-driving vehicle under routine conditions, but annoyance remains a concern with auditory displays. Application: Speech auditory displays showed promise as a means of increasing Level 1 situation awareness of routine scenarios during an autonomous vehicle ride with an unrelated secondary task.
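
The three-condition comparison at the heart of the method could be scored along the following lines. This is a sketch with simulated recall proportions; the study's actual data and analysis code are not reproduced here.

```python
# Sketch: compare memory for alerted events across the three display
# conditions with a one-way ANOVA. All values are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
speech = rng.normal(0.80, 0.10, 30)          # proportion of alerted events remembered
auditory_icons = rng.normal(0.70, 0.10, 30)
visual_control = rng.normal(0.65, 0.10, 30)

f, p = stats.f_oneway(speech, auditory_icons, visual_control)
print(f"F = {f:.2f}, p = {p:.3f}")
```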


Ergonomics | 2015

A comparison of human versus virtual interruptions

Michael A. Nees; Anjali Fortna

Although a wealth of research has examined the effects of virtual interruptions, human-initiated interruptions are common in many work settings. An experiment compared performance on a primary data-entry task during human-initiated (human) versus computer-initiated (virtual) interruptions. Participants completed blocks of trials that featured either an interruption from a computer or an interruption from a human experimenter. The timing of the onset of the interruptions was also varied across trials. Human interruptions resulted in much shorter interruption lags. No significant differences were observed for the number of correct responses on the primary task for human versus virtual interruptions, but interruptions that occurred later in the task sequence resulted in fewer mistakes. The social aspect of human interruptions may have attenuated interruption lags in that condition, and it is possible that virtual interruptions may permit people greater temporal flexibility in managing their engagement with interruptions. Practitioner Summary: An experiment compared human- and computer-initiated interruptions of a verbal data-entry task. Human-initiated interruptions resulted in much shorter interruption lags. Virtual interruptions may permit people greater temporal flexibility in managing their engagement with interruptions.
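
The key dependent measure, the interruption lag, is straightforward to compute from event timestamps. The sketch below assumes a hypothetical trial log; the paper's actual logging format is not specified here.

```python
# Sketch of computing interruption lag from logged timestamps
# (hypothetical log structure, not the study's software).
from dataclasses import dataclass

@dataclass
class Trial:
    interruption_onset: float         # s from trial start when the interruption appeared
    first_interruption_action: float  # s when the participant first acted on it

def interruption_lag(trial: Trial) -> float:
    """Time taken to disengage from the primary task and begin the interruption."""
    return trial.first_interruption_action - trial.interruption_onset

trials = [Trial(10.2, 11.0), Trial(8.5, 8.9)]  # made-up trials
mean_lag = sum(interruption_lag(t) for t in trials) / len(trials)
print(f"mean interruption lag: {mean_lag:.2f} s")
```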


Journal of Cognitive Psychology | 2013

Flexibility of working memory encoding in a sentence–picture–sound verification task

Michael A. Nees; Bruce N. Walker

Dual-process accounts of working memory have suggested distinct encoding processes for verbal and visual information in working memory, but encoding for nonspeech sounds (e.g., tones) is not well understood. This experiment modified the sentence–picture verification task to include nonspeech sounds with a complete factorial examination of all possible stimulus pairings. Participants studied simple stimuli (pictures, sentences, or sounds) and encoded the stimuli verbally, as visual images, or as auditory images. Participants then compared their encoded representations to verification stimuli (again pictures, sentences, or sounds) in a two-choice reaction time task. With some caveats, the encoding strategy appeared to be as important as, or more important than, the external format of the initial stimulus in determining the speed of verification decisions. Findings suggested that: (1) auditory imagery may be distinct from verbal and visuospatial processing in working memory; (2) visual perception but not visual imagery may automatically activate concurrent verbal codes; and (3) the effects of hearing a sound may linger for some time despite recoding in working memory. We discuss the role of auditory imagery in dual-process theories of working memory.
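
The fully crossed design the abstract describes can be enumerated directly; the condition labels below are paraphrases for illustration, not the paper's exact terms.

```python
# Sketch of the complete factorial design: study format x encoding
# strategy x verification format (labels are illustrative).
from itertools import product

formats = ["picture", "sentence", "sound"]
encodings = ["verbal", "visual imagery", "auditory imagery"]

conditions = list(product(formats, encodings, formats))
print(len(conditions))  # 27 cells
for study, encode, verify in conditions[:3]:
    print(f"study: {study} | encoding: {encode} | verification: {verify}")
```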


Mobile Media and Communication | 2017

Restricting mobile phone access during homework increases attainment of study goals

Chelsea M. Cutino; Michael A. Nees

Recent research has reported negative consequences, such as increased anxiety, associated with restricting people’s access to their mobile phones. These findings have led researchers to suggest that mobile phone use may constitute a genuinely addictive behavior for some people. Other research has suggested negative effects of mobile phones on academic outcomes. To study the effects of phone separation on both anxiety and attainment of academic study goals, we randomly assigned participants (N = 93) to a restricted mobile phone access condition or a control condition. After setting a list of goals for a study session, participants worked on their own, self-chosen class materials for 60 minutes. Anxiety was measured before and immediately following the study session. Attainment of study goals was assessed through a self-report estimate of the percent of study goals accomplished at the end of the session. We predicted that participants who scored high on a problematic mobile phone use scale and who had their phones taken away would show the greatest increases in anxiety over the session, as well as the greatest deficits in attainment of study goals, compared with all other participants. While there was a general tendency for participants who scored higher on the problematic use scale to be more anxious, anxiety did not differ between participants with phone access and those without it. Participants without phone access self-reported attainment of 12% more of their study goals than those with phone access. This study qualified the conditions under which restricting mobile phone access increases anxiety and provided further empirical support for detriments to attainment of study goals when mobile phones are present.


Frontiers in Psychology | 2016

Have We Forgotten Auditory Sensory Memory? Retention Intervals in Studies of Nonverbal Auditory Working Memory

Michael A. Nees

Researchers have shown increased interest in mechanisms of working memory for nonverbal sounds such as music and environmental sounds. These studies often have used two-stimulus comparison tasks: two sounds separated by a brief retention interval (often 3–5 s) are compared, and a “same” or “different” judgment is recorded. Researchers seem to have assumed that sensory memory has a negligible impact on performance in auditory two-stimulus comparison tasks. This assumption is examined in detail in this comment. According to seminal texts and recent research reports, sensory memory persists in parallel with working memory for a period of time after a stimulus is heard and can influence behavioral responses on memory tasks. Unlike verbal working memory studies that use serial recall tasks, research paradigms for exploring nonverbal working memory—especially two-stimulus comparison tasks—may not be differentiating working memory from sensory memory processes in analyses of behavioral responses, because retention interval durations have not excluded the possibility that the sensory memory trace drives task performance. This conflation of different constructs may be one contributor to discrepant research findings and the resulting proliferation of theoretical conjectures regarding mechanisms of working memory for nonverbal sounds.


Psychonomic Bulletin & Review | 2017

Maintenance of memory for melodies: Articulation or attentional refreshing?

Michael A. Nees; Ellen Corrini; Peri Leong; Joanna Harris

Past research on the effects of articulatory suppression on working memory for nonverbal sounds has been characterized by discrepant findings, which suggests that multiple mechanisms may be involved in the rehearsal of nonverbal sounds. In two experiments we examined the potential roles of two theoretical mechanisms of verbal working memory—articulatory rehearsal and attentional refreshing—in the maintenance of memory for short melodies. In both experiments, participants performed a same–different melody comparison task. During an 8-s retention interval, interference tasks were introduced to suppress articulatory rehearsal, attentional refreshing, or both. In Experiment 1, only the conditions that featured articulatory suppression resulted in worse memory performance than in a control condition, and the suppression of both attentional refreshing and articulatory rehearsal concurrently did not impair memory more than articulatory suppression alone. Experiment 2 reproduced these findings and also confirmed that the locus of interference was articulatory and not auditory (i.e., the interference was not attributable to the sound of participants’ own voices during articulatory suppression). Both experiments suggested that articulatory rehearsal played a role in the maintenance of melodies in memory, whereas attentional refreshing did not. We discuss potential theoretical implications regarding the mechanisms used for the rehearsal of nonverbal sounds in working memory.
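
Performance on a same-different comparison task is often summarized with a signal detection measure such as d'. The sketch below shows that computation with made-up response counts; it illustrates the task's scoring logic and is not the papers' reported analysis.

```python
# Illustrative d' computation for a same-different task
# (hypothetical counts, not the experiments' data).
from scipy.stats import norm

hits, misses = 38, 10                # "different" trials judged "different" / "same"
false_alarms, correct_rej = 12, 36   # "same" trials judged "different" / "same"

hit_rate = hits / (hits + misses)
fa_rate = false_alarms / (false_alarms + correct_rej)
d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
print(f"d' = {d_prime:.2f}")
```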


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2018

Drivers’ Perceptions of Functionality Implied by Terms Used to Describe Automation in Vehicles

Michael A. Nees

The expectations induced by the labels used to describe vehicle automation are important to understand, because research has shown that expectations can affect trust in automation even before a person uses the system for the first time. An online sample of drivers rated the perceived division of driving responsibilities implied by common terms used to describe automation. Ratings of 13 terms were made on a scale from 1 (“human driver is entirely responsible”) to 7 (“vehicle is entirely responsible”) for three driving tasks (steering, accelerating/braking, and monitoring). In several instances, the functionality implied by automation terms did not match the technical definitions of the terms and/or the actual capabilities of the automated vehicle functions currently described by the terms. These exploratory findings may spur and guide future research on this under-examined topic.
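
The ratings design lends itself to a simple term-by-task summary. The sketch below uses invented terms and ratings purely to show the shape of such a table; it is not the study's dataset.

```python
# Sketch: mean responsibility ratings (1 = human driver entirely
# responsible, 7 = vehicle entirely responsible) per term and task.
# Terms and ratings here are invented examples.
import pandas as pd

ratings = pd.DataFrame({
    "term":   ["autopilot", "autopilot", "driver assist", "driver assist"],
    "task":   ["steering", "monitoring", "steering", "monitoring"],
    "rating": [6, 5, 3, 2],
})

summary = ratings.groupby(["term", "task"])["rating"].mean().unstack()
print(summary)  # terms as rows, driving tasks as columns
```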

Collaboration


Dive into Michael A. Nees's collaborations.

Top Co-Authors

Bruce N. Walker (Georgia Institute of Technology)

Myounghoon Jeon (Michigan Technological University)

Thomas M. Gable (Georgia Institute of Technology)

Benjamin K. Davison (Georgia Institute of Technology)