Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Grant R. McMillan is active.

Publication


Featured research published by Grant R. McMillan.


Proceedings of the Third Annual Symposium on Human Interaction with Complex Systems (HICS '96) | 1996

EEG-based control for human-computer interaction

Gloria L. Calhoun; Grant R. McMillan

An interface whereby brain responses can control machines has been developed by the Armstrong Laboratory. This EEG-based control uses the magnitude of the steady-state visual evoked response (SSVER) as a control signal. The SSVER is identified and monitored using non-invasive scalp electrodes and advanced signal processing technology. With biofeedback, users learn to increase or decrease the magnitude of the SSVER to an evoking stimulus. These responses are translated into commands that control the operation of a physical device or computer program. After further development, this innovative interface could revolutionize human interaction with complex systems.
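The control scheme described above can be sketched in a few lines: estimate the SSVER magnitude as the EEG spectral amplitude at the evoking stimulus frequency, then map it to a command with two thresholds. This is a minimal illustration, not the Armstrong Laboratory implementation; the function names and threshold values are assumptions.

```python
import numpy as np

def ssver_magnitude(eeg, fs, stim_freq):
    """Spectral amplitude of the EEG at the evoking stimulus frequency,
    used here as a stand-in for the SSVER control signal."""
    window = np.hanning(len(eeg))
    spectrum = np.abs(np.fft.rfft(eeg * window))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[int(np.argmin(np.abs(freqs - stim_freq)))]

def to_command(magnitude, low=5.0, high=15.0):
    """Map the SSVER magnitude to a discrete command via two thresholds.
    The threshold values are illustrative only."""
    if magnitude > high:
        return "increase"
    if magnitude < low:
        return "decrease"
    return "hold"
```

With biofeedback training, a user would learn to drive the magnitude above or below the two thresholds, producing the binary control the abstract describes.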


Ergonomics | 1986

Psychophysical methods for equating performance between alternative motion simulators

John M. Flach; Gary E. Riccio; Grant R. McMillan; Rik Warren

Psychophysical matching techniques were employed to equate the subjective experience of motion in two roll-axis motion simulation devices: the RATS, a whole-body motion environment, and the dynamic seat sub-system of the ALCOGS, presenting motion cues through a moving seat pan. Two psychophysical techniques, cross-modality matching and magnitude estimation, yielded similar results. These results indicated that motion sensitivity increased with roll angular frequency for both simulators. However, the rate of increase at high frequencies was greater for the RATS than for the dynamic seat. These results were used to design a filter for the dynamic seat which enhanced high-frequency signal components. Tests in a roll-axis tracking task showed that performance in the dynamic seat using this filter was both quantitatively (in terms of r.m.s. error) and qualitatively (in terms of frequency characteristics) similar to performance in the whole-body motion environment.
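The filter in the abstract boosts high-frequency components of the seat command. One simple way to realize that idea is to add a scaled first-order high-pass of the input back onto the input, a shelving-style emphasis. This is a sketch under assumed parameters, not the filter the authors designed.

```python
import numpy as np

def high_freq_emphasis(signal, fs, cutoff_hz, gain):
    """Emphasize components above cutoff_hz:
    output = input + gain * (first-order high-pass of input).
    cutoff_hz and gain are illustrative, not the paper's values."""
    dt = 1.0 / fs
    rc = 1.0 / (2.0 * np.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    hp = np.zeros_like(signal)
    # Discrete first-order high-pass filter.
    for i in range(1, len(signal)):
        hp[i] = alpha * (hp[i - 1] + signal[i] - signal[i - 1])
    return signal + gain * hp
```

Low-frequency roll cues pass nearly unchanged, while components above the cutoff are amplified by roughly (1 + gain), compensating for the dynamic seat's weaker high-frequency sensitivity.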


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1998

Evaluation of an Electroencephalographic-Based Control Device

Keith S. Jones; Matthew S. Middendorf; Gloria L. Calhoun; Grant R. McMillan

Electroencephalographic (EEG)-based control devices are one of several emerging technologies that will provide operators with a variety of new hands-free control options. In general, EEG-based control translates brain electrical activity into a control signal. The system evaluated in this study uses the steady-state visual evoked response for system control. The luminance of selectable items on a computer display was modulated at different frequencies. The operator's choice between these items was identified by detecting which frequency pattern was dominant in the visual evoked brain activity. One objective of this study was to characterize the performance of this human-machine system. In addition, two candidate control frequencies were evaluated. The results are encouraging. Participants were able to use this form of EEG-based control, and performance was stable. Participants averaged over 90 percent correct selections. Future development will focus on increasing the speed and accuracy with which this novel hands-free controller can be utilized.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1996

An Initial Evaluation of a Direct Vestibular Display in a Virtual Environment

Jeffrey D. Cress; Lawrence J. Hettinger; James A. Cunningham; Gary E. Riccio; Grant R. McMillan; Michael W. Haas

The US Air Force Armstrong Laboratory's Human Interface Technology Branch is currently investigating the development and potential application of direct vestibular displays. The Electrical Vestibular Stimulus (EVS) technology described in this paper uses electrodes located behind the ears to deliver a low-level electrical current in the area of the eighth cranial nerve of the central nervous system to produce a compelling sensation of roll motion about the body's fore-aft axis. In this study, subjects experienced the EVS display while simultaneously observing a large field-of-view visual roll display, and were asked to rate various aspects of quality and magnitude of self-motion. The two displays were driven in a sinusoidal fashion at various phase relationships relative to one another. Results revealed that the fidelity of the motion experience depended upon the phase relationship between the two displays. Results also indicated that when an appropriate phase relationship was used, the vestibular display significantly improved the fidelity of the motion experience when compared to a visual-only display.
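The experimental manipulation above, driving two displays sinusoidally at a fixed relative phase, can be sketched as follows. The drive frequency, sampling rate, and recovery-by-cross-spectrum step are illustrative assumptions, not the study's apparatus.

```python
import numpy as np

def drive_signals(freq_hz, phase_deg, duration_s=10.0, fs=100.0):
    """Sinusoidal drive for the visual roll display and the EVS display,
    offset by a fixed relative phase (parameters are illustrative)."""
    t = np.arange(int(duration_s * fs)) / fs
    visual = np.sin(2.0 * np.pi * freq_hz * t)
    evs = np.sin(2.0 * np.pi * freq_hz * t + np.radians(phase_deg))
    return visual, evs

def relative_phase_deg(a, b, freq_hz, fs=100.0):
    """Recover the phase of b relative to a at the drive frequency
    from the cross-spectrum."""
    fa = np.fft.rfft(a)
    fb = np.fft.rfft(b)
    freqs = np.fft.rfftfreq(len(a), d=1.0 / fs)
    k = int(np.argmin(np.abs(freqs - freq_hz)))
    return float(np.degrees(np.angle(fb[k] * np.conj(fa[k]))))
```

Sweeping `phase_deg` across conditions and collecting self-motion ratings at each setting mirrors the design the abstract describes.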


Journal of Spinal Cord Medicine | 1999

Preliminary electrophysiological characterization of functionally vestigial muscles of the head: Potential for command signaling

Richard N. Friedman; Grant R. McMillan; John C. Kincaid; Ralph M. Buschbacher

In devastating neurological disorders, such as quadriplegia resulting from high-level spinal cord injury, it is essential to focus on functions that have been spared and optimally exploit them to enhance the individual's quality of life. It follows that certain muscles, which prior to the paralysis of much of the rest of the body seemed to have no useful function, might be used to provide unique signals to control assistive devices. This report presents preliminary electrophysiological data demonstrating potentially useful myoelectrical signals from 3 functionally vestigial muscles in humans: the posterior, anterior, and superior auricular muscles. In phylogenetically lower species, these muscles serve to position the ear to enhance hearing. The auricular muscles receive their major innervation from cranial nerve VII and should not be compromised by even high-level spinal cord lesions. In this study, it was found that the muscles could be voluntarily activated and, by standard surface-electrode recording, had potentials ranging to 680 microV in amplitude. Posterior auricular muscle potentials were used to command a paddle in a computer ping-pong task that employed a CyberLink interface. The t values for accuracy scores and ball hits were both significant at the p = .0001 level. These findings indicate that the auricular muscles may be useful for controlling assistive devices.
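Turning a voluntary auricular-muscle contraction into a command signal is conceptually simple: rectify the surface EMG, smooth it into an envelope, and fire when the envelope crosses a threshold. The sketch below assumes this standard pipeline; the threshold and window length are illustrative, not the CyberLink interface's actual parameters (the study reports potentials up to 680 microvolts).

```python
import numpy as np

def emg_command(samples_uv, threshold_uv=200.0, window=32):
    """Detect a voluntary muscle activation in surface EMG (microvolts):
    rectify, smooth with a moving average, and report whether the
    envelope ever crosses the threshold. Parameter values are
    illustrative assumptions."""
    kernel = np.ones(window) / window
    envelope = np.convolve(np.abs(samples_uv), kernel, mode="same")
    return bool(np.any(envelope > threshold_uv))
```

In a ping-pong-style task, such a detector could map a sustained posterior-auricular contraction to one paddle direction, with relaxation returning control to the other.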


IEEE Computer Graphics and Applications | 1997

Integrating vestibular displays for VE and airborne applications

Jeffrey D. Cress; Lawrence J. Hettinger; James A. Cunningham; Gary E. Riccio; Michael W. Haas; Grant R. McMillan

Designing effective human-machine interfaces is one of the more challenging and exciting issues facing engineers today. Many virtual environment (VE) designers approach this problem by examining the way humans interact with their natural environment. In essence, they have attempted to mimic the various ways humans use their senses to gather information. This approach seeks to present task-relevant information in a form that is familiar, compellingly realistic, and intuitive. We have taken this design approach for the development of a direct vestibular display. We intend to provide self-motion information to the vestibular system in a static virtual environment; that is, to simulate not only the look but also the feel of such sensations as turning and swaying. The article explains our rationale and method for doing so and reports results from our experiments using a visual-vestibular interface.


Interacting with Computers | 2003

Comparing mouse and steady-state visual evoked response-based control

Keith S. Jones; Matthew S. Middendorf; Grant R. McMillan; Gloria L. Calhoun; Joel S. Warm

Future computers will be more mobile, which will require new interaction methods. Accordingly, one might harness electroencephalographic (EEG) activity for computer control. Such devices exist, but all have limitations. Therefore, a novel EEG-based control was tested, which monitors the Steady-State Visual Evoked Response (SSVER). Selections are attempted by fixating a flickering target. A selection occurs if an SSVER is detected. To assess the device's relative performance, a mouse and the SSVER-based control were used to acquire targets of various sizes and distances. Accuracy and speed were measured. Overall, accuracy was poorer and acquisition times were longer with the SSVER-based control. However, the performance levels attained by the SSVER-based control might be adequate when manual controls are problematic, such as in assistive technology applications. In addition, in contrast to the mouse, SSVER-based acquisition times were insensitive to variations in target distance, which might serve as an operational advantage in certain applications.
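Target-acquisition tasks with varied sizes and distances are conventionally analyzed with Fitts' law, where movement time grows with an index of difficulty. The sketch below uses the Shannon formulation; the abstract does not specify this exact model, and the coefficients are illustrative. A device whose acquisition time is insensitive to distance, as reported for the SSVER-based control, would show a slope near zero.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def predicted_movement_time(distance, width, a=0.2, b=0.15):
    """Fitts'-law movement-time prediction, MT = a + b * ID.
    The a and b coefficients here are illustrative assumptions;
    a distance-insensitive device would have b close to zero."""
    return a + b * index_of_difficulty(distance, width)
```

Fitting `a` and `b` per device from the measured acquisition times would make the mouse-versus-SSVER contrast in the abstract quantitative.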


IEEE Virtual Reality Conference | 1997

An introduction of a direct vestibular display into a virtual environment

Jeffrey D. Cress; Lawrence J. Hettinger; James A. Cunningham; Gary E. Riccio; Grant R. McMillan; Michael W. Haas

The US Air Force Armstrong Synthesized Immersion Research Environment Facility is currently investigating the development and potential application of direct vestibular displays. The Electrical Vestibular Stimulus (EVS) technology described in the paper uses electrodes located behind the ears to deliver a low-level electrical current in the vicinity of the eighth cranial nerve of the central nervous system to produce a compelling sensation of roll motion about the body's fore-aft axis. In the study described, subjects experienced the EVS display while simultaneously observing a large field-of-view visual display which depicted curvilinear motion through a tunnel. Both EVS and visual displays were driven in a sinusoidal fashion at various phase relationships relative to one another. After observing the two displays, subjects were asked to rate various aspects of quality and magnitude of self-motion. Results revealed that the fidelity of the motion experience depended upon the phase relationship between the EVS and visual displays. Results also indicated that when an appropriate phase relationship was used, the vestibular display significantly improved the fidelity of the motion experience when compared to a visual-only display.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1999

Comparison of Hands-Free versus Conventional Wearable Computer Control for Maintenance Applications

Grant R. McMillan; Gloria L. Calhoun; Barbara L. Masquelier; Scott S. Grigsby; Laurie L. Quill; David E. Kancler; Allen R. Revels

Past research on wearable computers for maintenance applications has focused on developing displays and presentation formats. This study emphasized wearable computer control technologies. Alternative control technologies were compared with standard and voice controls. Twelve subjects performed a synthetic maintenance task using three control device combinations for three different types of input. Time and error data were collected. The results show that for pointer movement, standard controls took significantly longer than voice. For discrete input, standard controls required significantly more time than voice and alternative controls. However, there were no significant time differences among controllers for text entry fill-in. Error results showed no significant differences. This research suggests that alternative and voice controls provide similar performance levels and both are superior to standard controls. In environments with changing noise spectra and noise levels such as a flight line, the alternative control suite provides hands-free control that complements voice without sacrificing performance.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 1999

EEG-Based Control of Virtual Buttons

Matthew S. Middendorf; Keith S. Jones; Gloria L. Calhoun; Grant R. McMillan

Conventional methods for performing tasks on a computer typically include a keyboard for text entry and a mouse for pointing and selecting. In contrast to these commonly used controls are what have been referred to as alternative controls (McMillan, Eggleston and Anderson, 1997). Most of these devices are designed to be useful in hands-busy environments or when conventional controls are less accessible. The Alternative Control Technology program, located in the Air Force Research Laboratory, Wright-Patterson AFB, Ohio, has developed an alternative controller that translates electrical activity from the brain into a control signal. This electroencephalographic (EEG)-based system allows users to select virtual buttons on a computer screen simply by looking at the desired button. A virtual button is a small area of the screen, similar to an icon, that has a function associated with it. Control inputs are achieved by modulating the luminance of the virtual buttons at different frequencies, thereby causing a frequency-specific steady-state visual evoked response (SSVER) to appear in the user's EEG when the user gazes at a button. The SSVER is characterized by an increase in amplitude at the luminance modulation frequency (Regan, 1989). Once an SSVER is reliably detected, the corresponding virtual button is selected and the associated function is performed. Because this system capitalizes on naturally occurring SSVER amplitudes at multiple frequencies, user training is not required.
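The selection logic described above, each button flickering at its own frequency and the winner identified by the strongest frequency-specific amplitude in the EEG, can be sketched as follows. The dominance margin and the simple FFT-peak detector are assumptions for illustration, not the program's detection algorithm.

```python
import numpy as np

def select_button(eeg, fs, button_freqs, margin=1.5):
    """Return the index of the virtual button whose luminance-modulation
    frequency shows the strongest amplitude in the EEG spectrum, or None
    if no frequency beats its nearest competitor by `margin` (the margin
    value is an illustrative assumption)."""
    window = np.hanning(len(eeg))
    spectrum = np.abs(np.fft.rfft(eeg * window))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    amps = np.array(
        [spectrum[int(np.argmin(np.abs(freqs - f)))] for f in button_freqs]
    )
    order = np.argsort(amps)[::-1]
    if amps[order[0]] >= margin * max(amps[order[1]], 1e-12):
        return int(order[0])
    return None
```

Requiring a dominance margin before selecting is one way to trade selection speed for fewer false activations, the speed/accuracy balance the companion evaluation papers discuss.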

Collaboration


Dive into Grant R. McMillan's collaborations.

Top Co-Authors

Gloria L. Calhoun
Wright-Patterson Air Force Base

Rik Warren
Wright-Patterson Air Force Base

Keith S. Jones
University of Cincinnati

Michael W. Haas
Wright-Patterson Air Force Base

Allen R. Revels
University of Dayton Research Institute

Barbara L. Masquelier
Air Force Research Laboratory

David E. Kancler
University of Dayton Research Institute