
Publication


Featured research published by James Gips.


Cyberpsychology, Behavior, and Social Networking | 2011

Media Multitasking Behavior: Concurrent Television and Computer Usage

S. Adam Brasel; James Gips

Changes in the media landscape have made simultaneous usage of the computer and television increasingly commonplace, but little research has explored how individuals navigate this media multitasking environment. Prior work suggests that self-insight may be limited in media consumption and multitasking environments, reinforcing a rising need for direct observational research. A laboratory experiment recorded both younger and older individuals as they used a computer and television concurrently, multitasking across television and Internet content. Results show that individuals attend primarily to the computer during media multitasking. Although gazes last longer on the computer than on the television, the overall distribution of gazes is strongly skewed toward very short gazes only a few seconds in duration. People switched between media at an extreme rate, averaging more than 4 switches per minute and 120 switches over the 27.5-minute study exposure. Participants had little insight into their switching activity, recalling their switching behavior at an average of only 12 percent of the actual switching rate revealed in the objective data. Younger individuals switched more often than older individuals, but other individual differences such as stated multitasking preference and polychronicity had little effect on switching patterns or gaze duration. This overall pattern of results highlights the importance of exploring new media environments, such as the current drive toward media multitasking, and reinforces that self-monitoring, post hoc surveying, and lay theory may offer only limited insight into how individuals interact with media.


Universal Access in the Information Society | 2003

Communication via eye blinks and eyebrow raises: video-based human-computer interfaces

Kristen Grauman; Margrit Betke; Jonathan Lombardi; James Gips; Gary R. Bradski

Two video-based human-computer interaction tools are introduced that can activate a binary switch and issue a selection command. “BlinkLink,” as the first tool is called, automatically detects a user’s eye blinks and accurately measures their durations. The system is intended to provide an alternate input modality to allow people with severe disabilities to access a computer. Voluntary long blinks trigger mouse clicks, while involuntary short blinks are ignored. The system enables communication using “blink patterns”: sequences of long and short blinks that are interpreted as semiotic messages. The second tool, “EyebrowClicker,” automatically detects when a user raises his or her eyebrows and then triggers a mouse click. Both systems can initialize themselves, track the eyes at frame rate, and recover in the event of errors. No special lighting is required. The systems have been tested with interactive games and a spelling program. Results demonstrate overall detection accuracy of 95.6% for BlinkLink and 89.0% for EyebrowClicker.
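The long-versus-short blink logic described above can be sketched as a simple duration threshold. This is a minimal illustration, not the published system: the threshold value, function names, and the L/S pattern encoding are all assumptions for the example.

```python
# Hypothetical sketch of BlinkLink-style blink classification.
# The 300 ms threshold separating voluntary long blinks from
# involuntary short ones is an assumed value, not from the paper.
LONG_BLINK_MS = 300

def classify_blinks(durations_ms):
    """Map measured blink durations to events: long blinks trigger a
    'click'; short (involuntary) blinks are ignored (None)."""
    return ["click" if d >= LONG_BLINK_MS else None for d in durations_ms]

def blink_pattern(durations_ms):
    """Encode a blink sequence as a string of L (long) and S (short),
    so that patterns can be interpreted as messages."""
    return "".join("L" if d >= LONG_BLINK_MS else "S" for d in durations_ms)
```

For example, `classify_blinks([100, 400])` ignores the 100 ms blink and turns the 400 ms blink into a click, while `blink_pattern([100, 400, 500])` yields the pattern "SLL".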


Environment and Planning B: Planning & Design | 1980

Production systems and grammars: a uniform characterization

James Gips; George Stiny

The common structure underlying production system formalisms is developed. A variety of production system formalisms are summarized in terms of this structure. The structure is useful both for understanding existing types of production systems and for developing new ones.
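As a concrete illustration of the common structure underlying production systems, here is a minimal string-rewriting sketch: an ordered set of rules, a conflict-resolution strategy (first applicable rule wins), and a run loop that fires rules until quiescence. The rule format and control strategy are a simplification for illustration, not the paper's formalism.

```python
def apply_once(state, rules):
    """Apply the first applicable rewrite rule (lhs -> rhs) to the
    state string; return None when no rule applies."""
    for lhs, rhs in rules:
        if lhs in state:
            return state.replace(lhs, rhs, 1)
    return None

def run(state, rules, limit=100):
    """Run the production system until no rule applies (quiescence),
    with a step limit as a guard against non-terminating rule sets."""
    for _ in range(limit):
        nxt = apply_once(state, rules)
        if nxt is None:
            return state
        state = nxt
    return state
```

With the rules `[("ab", "b"), ("b", "")]`, the state "aab" rewrites step by step to "ab", then "b", then the empty string. The same skeleton covers grammar-style generation (rules expand symbols) and recognition (rules reduce strings), which is the sense in which the two formalisms share one structure.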


International Journal of Psychophysiology | 1998

Eye movement control of computer functions

Joseph J. Tecce; James Gips; C. P. Olivieri; L. J. Pok; M. R. Consiglio

The control of computer functions by eye movements was demonstrated in 14 normal volunteers. Electrical potentials recorded by horizontal and vertical electrooculography (EOG) were transformed into a cursor that represented a moving fixation point on a computer display. Subjects were able to spell words and sentences by using eye movements to place the cursor on target letters in the display of an alphabet matrix. The successful demonstration of computer-controlled syntactic construction by eye movements offers a potentially useful technique for computer-assisted communication in special groups, such as developmentally disabled individuals who have motor paralysis and who cannot speak.
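The core transformation described above, from EOG potentials to a position in an alphabet matrix, can be sketched as a linear mapping. The voltage ranges, grid size, and function name below are assumptions for illustration; the published system's calibration is not described in the abstract.

```python
def eog_to_cell(h_uv, v_uv, grid=(5, 6),
                h_range=(-200.0, 200.0), v_range=(-200.0, 200.0)):
    """Map horizontal/vertical EOG potentials (in microvolts; ranges
    assumed) to a (row, col) cell in an alphabet matrix, clamping
    out-of-range signals to the matrix edges."""
    def scale(x, lo, hi, n):
        frac = (x - lo) / (hi - lo)          # normalize to [0, 1]
        return min(n - 1, max(0, int(frac * n)))
    rows, cols = grid
    return scale(v_uv, *v_range, rows), scale(h_uv, *h_range, cols)
```

A centered gaze (0, 0) lands in the middle of a 5×6 matrix, and saturated signals clamp to the corner cells rather than indexing off the grid.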


IEEE Transactions on Systems, Man, and Cybernetics | 2008

A Human–Computer Interface Using Symmetry Between Eyes to Detect Gaze Direction

John J. Magee; Margrit Betke; James Gips; Matthew R. Scott; Benjamin N. Waber

In cases of paralysis so severe that a person's ability to control movement is limited to the muscles around the eyes, eye movements or blinks are the only way for the person to communicate. Interfaces that assist in such communication are often intrusive, require special hardware, or rely on active infrared illumination. A nonintrusive communication interface system called EyeKeys was therefore developed, which runs on a consumer-grade computer with video input from an inexpensive Universal Serial Bus camera and works without special lighting. The system detects and tracks the person's face using multiscale template correlation. The symmetry between left and right eyes is exploited to detect whether the person is looking at the camera or to the left or right side. The detected eye direction can then be used to control applications such as spelling programs or games. The game “BlockEscape” was developed to evaluate the performance of EyeKeys and compare it to a mouse substitution interface. Experiments with EyeKeys have shown that it is an easily used computer input and control device for able-bodied people and has the potential to become a practical tool for people with severe paralysis.
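The symmetry cue mentioned above can be sketched as follows: mirror one eye patch horizontally and compare it to the other with a normalized correlation score. When the eyes look straight at the camera the two patches are near mirror images and the score is high; a sideways gaze breaks the symmetry. This is a sketch under assumed names and a single threshold, not the published system's decision rule, which is more involved.

```python
import numpy as np

def gaze_direction(left_eye, right_eye, center_thresh=0.9):
    """Compare the left-eye patch against a horizontally mirrored
    right-eye patch using zero-mean normalized correlation. A high
    score suggests a frontal ('center') gaze; a low score suggests
    the gaze has shifted to a side. (Threshold value is assumed.)"""
    mirrored = np.fliplr(right_eye)
    a = left_eye - left_eye.mean()
    b = mirrored - mirrored.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    ncc = float((a * b).sum() / denom) if denom else 0.0
    return "center" if ncc >= center_thresh else "side"
```

Deciding left versus right would additionally require comparing correlation scores at shifted offsets, which this sketch omits.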


Workshop on Applications of Computer Vision | 2002

Evaluation of tracking methods for human-computer interaction

Christopher Fagiani; Margrit Betke; James Gips

Tracking methods are evaluated in a real-time feature tracking system used for human-computer interaction (HCI). The Camera Mouse, an HCI system for people with severe disabilities that interprets video input to manipulate the mouse pointer, was improved and used as the test platform for this study. The tracking methods tested are the Lucas-Kanade tracker and a tracker based on normalized correlation. Both methods are evaluated with and without multidimensional Kalman filters. Two-, four-, and six-dimensional filters are tested to model feature location, velocity, and acceleration. The various tracker and filter combinations are evaluated for accuracy, computational efficiency, and practicality. The normalized correlation coefficient tracker without Kalman filtering is found to be the tracker best suited for a variety of HCI tasks.
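The four-dimensional filter variant mentioned above, modeling feature location and velocity, corresponds to a standard constant-velocity Kalman filter. A minimal sketch follows; the noise magnitudes and frame rate are assumed values, not parameters from the paper.

```python
import numpy as np

def make_cv_kalman(dt=1.0):
    """Four-dimensional constant-velocity Kalman model for a tracked
    feature: state [x, y, vx, vy], position-only measurements.
    Process/measurement noise magnitudes here are assumed."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], float)   # state transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)    # observe position only
    Q = np.eye(4) * 1e-3                   # process noise (assumed)
    R = np.eye(2) * 1.0                    # measurement noise (assumed)
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle for measurement z."""
    x = F @ x                               # predict state
    P = F @ P @ F.T + Q                     # predict covariance
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

Dropping the velocity terms gives the two-dimensional variant; appending acceleration states gives the six-dimensional one. The paper's finding that the unfiltered normalized-correlation tracker worked best suggests the filter's smoothing lag outweighed its noise rejection for these HCI tasks.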


Lecture Notes in Computer Science | 1998

On Building Intelligence into EagleEyes

James Gips

EagleEyes is a system that allows the user to control the computer through electrodes placed on the head. For people without disabilities it takes 15 to 30 minutes to learn to control the cursor sufficiently to spell out a message with an onscreen keyboard. We currently are working with two dozen children with profound disabilities to teach them to use EagleEyes to control computer software for entertainment, communication, and education. We have had some dramatic successes.


Environment and Planning B: Planning & Design | 1978

An Evaluation of Palladian Plans

George Stiny; James Gips

Criteria are suggested for the aesthetic evaluation of Palladian villa plans. These criteria are applied to two catalogues of all possible plans constructed on underlying grids of sizes 3 × 3 and 5 × 3 respectively.


International Conference on Computers Helping People with Special Needs | 2004

The Effect of Assistive Technology on Educational Costs: Two Case Studies

Amy Gips; Philip DiMattia; James Gips

Until recently, children with very profound disabilities (children who cannot speak and can move only their eyes or head) could be made comfortable, but by and large could not be educated. Assistive technologies now enable them to communicate and to be educated alongside their non-disabled peers. This is a wonderful development. But what is the financial cost? In this paper we look in detail at the costs associated with the education of two children who have used assistive technologies developed at our university and compare them with the educational costs had they not started using the assistive technologies. For these two children, the costs of the technologies and of the special teachers hired are offset by savings on the tuition and transportation of sending them to special schools.


Leonardo | 1975

An Investigation of Algorithmic Aesthetics

James Gips; George Stiny

This paper describes an investigation of aesthetics in terms of algorithms. The main result of the work is the development of a formalism, an aesthetic system, for the algorithmic specification of aesthetic viewpoints. An aesthetic viewpoint is taken to consist of a collection of interpretative conventions and evaluative criteria for art. Interpretative conventions determine how an object can be understood as a work of art; evaluative criteria determine the judged quality of such an object when it is understood in this way. Viewpoints vary for different people, different cultures and different art forms. These systems allow for the specification of different viewpoints in terms of a uniformly structured system of algorithms. An example of an aesthetic system for nonfigurative, rectilinear pictures is given. This system is applied to six pictures shown in the paper. The computer implementation of the system is described briefly.

Collaboration


Top co-authors of James Gips and their affiliations:

George Stiny (Massachusetts Institute of Technology)
Holly A. Yanco (University of Massachusetts Lowell)
Kristen Grauman (University of Texas at Austin)
Benjamin N. Waber (Massachusetts Institute of Technology)