
Publications


Featured research published by Gregory M. Burnett.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2012

Individual Differences in Multimodal Waypoint Navigation

Andre Garcia; Victor Finomore; Gregory M. Burnett; Carryl L. Baldwin; Christopher Brill

Waypoint navigation is a critical task for dismounted soldiers, especially when navigating through novel environments with potential threats. In these dangerous environments, soldiers should keep their “eyes-up” and “ears-out,” scanning the environment for critical signals. Current practices for dismounted soldiers include the use of a compass and map or a small wearable computer in order to navigate. In this experiment, we compared waypoint navigation performance across several modalities and multiple combinations of these modalities. These modalities include two visual displays (an egocentric and a geocentric map), 3D spatialized audio, tactile cues, and multimodal combinations of each. We also examined individual differences in sense of direction as a potential moderator of display usage. Results provide preliminary evidence that localized 3D audio and haptic navigation aids are an intuitive, efficient, and effective means of waypoint navigation, regardless of sense of direction.
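
As an illustration of the 3D spatialized audio modality discussed above, here is a minimal sketch of one way a navigation tone could be panned toward a waypoint's direction; the constant-power pan law and the pan_gains helper are assumptions for illustration, not the study's actual rendering pipeline.

```python
import math

def pan_gains(relative_bearing_deg: float) -> tuple[float, float]:
    """Constant-power stereo gains for a cue at the given relative bearing.

    relative_bearing_deg: waypoint direction relative to the listener's
    heading, in degrees; negative values are to the left.
    Returns (left_gain, right_gain).
    """
    # Clamp to the frontal arc and map -90..+90 degrees onto 0..pi/2.
    b = max(-90.0, min(90.0, relative_bearing_deg))
    theta = (b + 90.0) / 180.0 * (math.pi / 2.0)
    # Constant-power pan law: left^2 + right^2 == 1 at every bearing,
    # so perceived loudness stays steady as the cue sweeps across.
    return math.cos(theta), math.sin(theta)

# A waypoint 45 degrees to the right plays louder in the right ear.
left, right = pan_gains(45.0)  # ~ (0.38, 0.92)
```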


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2013

Evaluation of a Mobile Application for Multimodal Land Navigation

Andrés A. Calvo; Victor Finomore; Gregory M. Burnett; Thomas McNitt

Among their many dismounted roles, U.S. Air Force Battlefield Airmen must navigate in unfamiliar environments with many potential threats while performing their mission objectives. The effectiveness of navigating with a map and compass is compromised in reduced-visibility conditions, such as in fog or during nighttime operations. Moreover, focusing on a map draws attention away from the immediate surroundings and reduces the ability to detect threats. To ameliorate this problem, we prototyped auditory and tactile navigation displays controlled by nothing more than a mobile phone, using its built-in GPS and compass. The auditory and tactile displays direct users toward their destination with 3D audio and vibrotactile cues, respectively. We evaluated the navigation displays on a waypoint navigation task. Results suggest that auditory and tactile displays can guide users to their destination as effectively as a visual display (i.e., a GPS-enabled map). Initial findings justify further development of multimodal navigation displays to increase the efficiency of Battlefield Airmen in land navigation tasks.
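
To make the phone-only design concrete, the sketch below derives a waypoint's bearing relative to the user's compass heading from two GPS fixes, the core quantity such auditory and tactile displays would steer by; the function name and signed-angle convention are illustrative assumptions, not the authors' code.

```python
import math

def relative_bearing(lat1: float, lon1: float,
                     lat2: float, lon2: float,
                     compass_heading_deg: float) -> float:
    """Bearing to the destination relative to the user's current heading.

    (lat1, lon1): current GPS fix; (lat2, lon2): destination waypoint;
    compass_heading_deg: device heading, degrees clockwise from true north.
    Returns degrees in -180..180; negative means "turn left".
    """
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Initial great-circle bearing from the current fix to the waypoint.
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    true_bearing = math.degrees(math.atan2(y, x)) % 360.0
    # Fold into a signed angle relative to where the user is facing.
    return (true_bearing - compass_heading_deg + 180.0) % 360.0 - 180.0

# Facing due north, a waypoint to the east is 90 degrees to the right.
assert round(relative_bearing(0.0, 0.0, 0.0, 1.0, 0.0)) == 90
```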


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2014

Demonstration and Evaluation of an Eyes-Free Mobile Navigation System

Andrés A. Calvo; Victor Finomore; Thomas McNitt; Gregory M. Burnett

Loss of awareness of one’s immediate surroundings can have devastating results when navigating. For instance, military operators must often navigate in unfamiliar environments and must be able to detect nearby threats to survive. Visual displays such as paper or digital maps can draw visual attention away from one’s environment. We developed a navigation display that guides a user through a series of waypoints by playing a 3D audio tone over headphones or vibrating a tactor on an array around the torso. We evaluated the navigation display by having participants navigate through 32 waypoints in an open field. In addition to evaluating auditory and vibrotactile cues, we considered an analog visual cue, an allocentric map, and an egocentric map. Participants were able to reach all waypoints in every condition. Results suggest that participants reached waypoints fastest with the egocentric map. Additionally, participants were slightly faster with the auditory cue than with the vibrotactile cue. Subjective workload and usability questionnaires indicated that both of these conditions were low in mental demand and highly usable. These results support the development of eyes-free mobile navigation tools.
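
As a sketch of how a torso tactor array could encode direction, the snippet below maps a relative bearing onto one of several evenly spaced tactors; the belt size of eight and the tactor_for_bearing helper are assumptions, since the paper does not specify the array layout.

```python
def tactor_for_bearing(relative_bearing_deg: float, n_tactors: int = 8) -> int:
    """Choose which tactor on an evenly spaced torso belt should vibrate.

    Tactor 0 sits front and center (straight ahead); indices increase
    clockwise around the torso. n_tactors = 8 is an assumed belt size.
    """
    sector = 360.0 / n_tactors
    # Shift by half a sector so each tactor owns the arc centered on it.
    return int(((relative_bearing_deg % 360.0) + sector / 2.0) // sector) % n_tactors

# A waypoint dead ahead fires tactor 0; one directly behind fires tactor 4.
assert tactor_for_bearing(0.0) == 0
assert tactor_for_bearing(180.0) == 4
```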


Proceedings of the Human Factors and Ergonomics Society 56th Annual Meeting | 2012

The Design, Implementation, and Evaluation of a Pointing Device for a Wearable Computer

Andrés A. Calvo; Gregory M. Burnett; Victor Finomore; Saverio Perugini

U.S. Air Force special tactics operators at times use small wearable computers (SWCs) for mission objectives. The primary pointing device of an SWC is either a touchpad or trackpoint embedded into the chassis of the SWC. In situations where the user cannot directly interact with these pointing devices, the utility of the SWC is decreased. We developed a pointing device called the G3 for use with operators’ SWCs. The device utilizes gyroscopic sensors attached to the user’s index finger to move the computer cursor according to the angular velocity of the finger. We showed that, as measured by Fitts’ law, the overall performance and accuracy of the G3 were better than those of the touchpad and trackpoint. These findings suggest that the G3 can adequately be used with SWCs. Additionally, we investigated the G3’s utility as a control device for operating micro remotely piloted aircraft.
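
Since the abstract measures performance with Fitts’ law, a short worked example may help: the snippet below computes the Shannon-form index of difficulty and the resulting throughput for one hypothetical pointing trial; the numbers are illustrative, not data from the study.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Pointing throughput in bits per second for one completed movement."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical trial: a 256-px reach to a 32-px target completed in 1.2 s.
# ID = log2(256/32 + 1) ~ 3.17 bits, so throughput ~ 2.64 bits/s.
tp = throughput(256.0, 32.0, 1.2)
```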


International Conference on Distributed, Ambient and Pervasive Interactions | 2013

The Effects of Multimodal Mobile Communications on Cooperative Team Interactions Executing Distributed Tasks

Gregory M. Burnett; Andrés A. Calvo; Victor Finomore; Gregory J. Funke

Mobile devices are rapidly becoming an indispensable part of our everyday life. Integrated with various embedded sensors and able to support on-the-move processing, mobile devices are being investigated as potential tools to support cooperative team interactions and distributed real-time decision making in both military and civilian applications. A driving interest is how a mobile device equipped with multimodal communication capabilities can contribute to the effectiveness and efficiency of real-time task outcomes and performance. In this paper, we investigate the effects of a prototype multimodal collaborative Android application on distributed collaborating partners jointly working on a physical task. The mobile application supports real-time dissemination of an active workspace’s perspective between distributed operators. The prototype application was demonstrated in a scenario where teammates utilized different features of the software to collaboratively assemble a complex structure. Results indicated significant improvements in completion times when users visually shared their perspectives and were able to utilize image annotation versus relying on verbal descriptors.
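
Purely as a sketch of what disseminating an annotated workspace perspective might involve, the snippet below serializes one shared-frame update as JSON; the schema, field names, and perspective_update helper are hypothetical, since the paper does not describe its wire format.

```python
import base64
import json
import time

def perspective_update(operator_id: str, jpeg_bytes: bytes, annotations: list) -> str:
    """Serialize one shared-perspective update as a JSON string.

    annotations: dicts like {"type": "arrow", "x": 0.4, "y": 0.6}, with
    coordinates normalized to the image so they survive display resizing.
    """
    return json.dumps({
        "operator": operator_id,    # who is sharing the view
        "timestamp": time.time(),   # seconds since the epoch
        "image_jpeg_b64": base64.b64encode(jpeg_bytes).decode("ascii"),
        "annotations": annotations, # marks drawn on top of the image
    })

# One annotated frame from a hypothetical operator "alpha-2".
msg = perspective_update("alpha-2", b"\xff\xd8...",
                         [{"type": "arrow", "x": 0.4, "y": 0.6}])
```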


Mobile Computing, Applications, and Services | 2012

Multimodal Mobile Collaboration Prototype Used in a Find, Fix, and Tag Scenario

Gregory M. Burnett; Thomas Wischgoll; Victor Finomore; Andrés A. Calvo

Given recent technological advancements in mobile devices, military research initiatives are investigating these devices as a means to support multimodal cooperative interactions. Military components execute dynamic combat and humanitarian missions while dismounted and on the move. Paramount to their success are timely and effective information sharing and mission planning. In this paper, we describe a prototype multimodal collaborative Android application. The mobile application was designed to support real-time battlefield perspective, acquisition, and dissemination of information among distributed operators. The prototype application was demonstrated in a scenario where teammates utilized different features of the software to collaboratively identify and deploy a virtual tracker-type device on hostile entities. Results showed significant improvements in completion times when users visually shared their perspectives versus relying on verbal descriptors. Additionally, the use of shared video significantly reduced the utterances required to complete the task.


Proceedings of SPIE | 2012

Ergonomic design considerations for an optical data link between a warfighter's head and body-worn technologies

Noel Trew; Gregory M. Burnett; Michael Sedillo; Candace Washington; Aaron Linn; Zachery Nelson

Today, warfighters are burdened by a web of cables linking technologies that span the head and torso regions of the body. These cables provide interoperability between helmet-worn peripherals, such as head-mounted displays (HMDs), cameras, and communication equipment, and chest-worn computers and radios. Although it enables enhanced capabilities, this cabling also poses snag hazards and makes it difficult for the warfighter to extricate himself from his kit when necessary. A newly developed wireless personal area network (WPAN), one that uses optical transceivers, may prove to be an acceptable alternative to traditional cabling. Researchers at the Air Force Research Laboratory’s 711th Human Performance Wing are exploring how best to mount the WPAN transceivers on the body in order to facilitate unimpeded data transfer while also maintaining the operator’s natural range of motion. This report describes the two-step research process used to identify the performance limitations and usability of a body-worn optical wireless system. First, researchers characterized the field of view for the current generation of optical WPAN transceivers. Then, this field of view was compared with anthropometric data describing the range of motion of the cervical vertebrae to see whether the data link would be lost at the extremes of an operator’s head movement. Finally, this report includes an additional discussion of other possible military applications for an optical WPAN.
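
The second step of the study compares a transceiver’s field of view against cervical range of motion; a minimal sketch of that comparison, with placeholder angles rather than the study’s measured values, is below.

```python
def link_maintained(fov_half_angle_deg: float, head_rotation_deg: float) -> bool:
    """Check whether an optical link survives a given head rotation.

    fov_half_angle_deg: transceiver field-of-view half-angle, i.e. the cone
    within which the helmet- and chest-worn units can still see each other.
    head_rotation_deg: rotation away from the neutral, aligned posture.
    Both angles here are illustrative placeholders, not measured values.
    """
    return abs(head_rotation_deg) <= fov_half_angle_deg

# With a hypothetical 60-degree half-angle, a 45-degree head turn keeps the
# link, while a turn near the roughly 80-degree axial limit of the neck
# drops it at the extreme.
assert link_maintained(60.0, 45.0)
assert not link_maintained(60.0, 80.0)
```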


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2011

Tools for Battlefield Airmen: Where We’ve Been and Where We’re Going

Jill Ritter; MSgt Robert Bean; Gregory M. Burnett; Randy Mieskoski; Victor Finomore; Laura G. Militello

Air Force Special Operations Command’s Battlefield Airmen (BA) work in physically and cognitively demanding conditions. They conduct asymmetric and irregular warfare operations that require advanced technologies to enhance situational awareness and battlefield effectiveness. Recent efforts have focused on providing tools for the Air Force Combat Controllers, one type of BA. Advances in technology include a wearable computer, advanced targeting capability, physiological sensors, lightweight power, and ergonomically superior load carriage concepts. This work will be leveraged to expand the scope to assess the needs of the Pararescue Jumper (PJ). This panel will include a combination of researchers and operational personnel reviewing previous successes, outlining the operational challenges of a PJ, painting a vision for the next phases, and drawing from cognitive research in related settings.


Archive | 2013

Load carriage connector and system

Gregory M. Burnett; Michael R. Sedillo


Archive | 2013

Appendage-mounted display apparatus

Michael R. Sedillo; Gregory M. Burnett

Collaboration


Dive into Gregory M. Burnett's collaboration.

Top Co-Authors

Victor Finomore, Air Force Research Laboratory
Andrés A. Calvo, Massachusetts Institute of Technology
Andre Garcia, George Mason University
Candace Washington, Air Force Research Laboratory
Thomas McNitt, Air Force Research Laboratory
Chris Brill, Old Dominion University