Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Walter F. Bischof is active.

Publication


Featured research published by Walter F. Bischof.


Computers and Biomedical Research | 1991

Automated detection of breast tumors using the asymmetry approach

Tin-Kit Lau; Walter F. Bischof

A method for automated detection of breast tumors in mammograms is presented. The method uses the asymmetry principle: Strong structural asymmetries between corresponding regions in the left and right breast are taken as evidence for the possible presence of a tumor in that region. Asymmetry detection is achieved in two steps. First, mammograms are aligned, compensating for possible differences in size and shape between the two breasts. Second, asymmetry between corresponding positions is determined using a combination of several asymmetry measures, each responding to different types of asymmetries. Results obtained with a set of mammograms indicate that this method can improve the sensitivity and reliability of systems for automated detection of breast tumors.
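
As a rough illustration of the asymmetry principle described above, the sketch below flags image blocks whose mean brightness differs strongly between the aligned left and right mammograms. The alignment step is assumed to be done already, and the single brightness-difference measure, block size, and threshold are illustrative stand-ins for the paper's combination of several asymmetry measures.

```python
import numpy as np

def asymmetry_map(left, right, block=16, threshold=0.25):
    """Flag blocks whose mean brightness differs strongly between breasts.

    Assumes `left` and `right` are aligned, equal-sized grayscale arrays in
    [0, 1]; a single brightness-difference measure stands in for the paper's
    combination of several asymmetry measures.
    """
    h, w = left.shape
    suspicious = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            l = left[y:y + block, x:x + block].mean()
            r = right[y:y + block, x:x + block].mean()
            if abs(l - r) > threshold:          # strong structural asymmetry
                suspicious.append((y, x, round(float(l - r), 3)))
    return suspicious

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    left = rng.normal(0.4, 0.05, (128, 128)).clip(0, 1)
    right = left.copy()
    right[32:64, 32:64] += 0.4                  # simulated bright mass in one breast
    print(asymmetry_map(left, right.clip(0, 1)))
```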


Visual Cognition | 2008

Gaze selection in complex social scenes

Elina Birmingham; Walter F. Bischof; Alan Kingstone

A great deal of recent research has sought to understand the factors and neural systems that mediate the orienting of spatial attention to a gazed-at location. What have rarely been examined, however, are the factors that are critical to the initial selection of gaze information from complex visual scenes. For instance, is gaze prioritized relative to other possible body parts and objects within a scene? The present study springboards from the seminal work of Yarbus (1965/1967), who had originally examined participants’ scan paths while they viewed visual scenes containing one or more people. His work suggested to us that the selection of gaze information may depend on the task that is assigned to participants, the social content of the scene, and/or the activity level depicted within the scene. Our results show clearly that all of these factors can significantly modulate the selection of gaze information. Specifically, the selection of gaze was enhanced when the task was to describe the social attention within a scene, and when the social content and activity level in a scene were high. Nevertheless, it is also the case that participants always selected gaze information more than any other stimulus. Our study has broad implications for future investigations of social attention as well as resolving a number of longstanding issues that had undermined the classic original work of Yarbus.


Quarterly Journal of Experimental Psychology | 2008

Social attention and real-world scenes: The roles of action, competition and social content

Elina Birmingham; Walter F. Bischof; Alan Kingstone

The present study examined how social attention is influenced by social content and the presence of items that are available for attention. We monitored observers’ eye movements while they freely viewed real-world social scenes containing either 1 or 3 people situated among a variety of objects. Building from the work of Yarbus (1965/1967) we hypothesized that observers would demonstrate a preferential bias to fixate the eyes of the people in the scene, although other items would also receive attention. In addition, we hypothesized that fixations to the eyes would increase as the social content (i.e., number of people) increased. Both hypotheses were supported by the data, and we also found that the level of activity in the scene influenced attention to eyes when social content was high. The present results provide support for the notion that the eyes are selected by others in order to extract social information. Our study also suggests a simple and surreptitious methodology for studying social attention to real-world stimuli in a range of populations, such as those with autism spectrum disorders.
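
The fixation analysis implied here can be made concrete with a small sketch that assigns each fixation to a labelled scene region and reports the proportion landing on each; the region labels, box representation, and counting rule are assumptions for illustration, not the authors' actual coding scheme.

```python
from collections import Counter

def region_of(fixation, regions):
    """Return the label of the first region containing the fixation point.

    `regions` maps a label (e.g. 'eyes', 'head', 'body', 'object') to an
    axis-aligned box (x0, y0, x1, y1); labels and boxes are illustrative.
    """
    x, y = fixation
    for label, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return "background"

def fixation_proportions(fixations, regions):
    """Proportion of fixations falling in each labelled region."""
    counts = Counter(region_of(f, regions) for f in fixations)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

if __name__ == "__main__":
    # 'eyes' is listed before 'head' so fixations on the eyes are not
    # swallowed by the enclosing head region.
    regions = {"eyes": (100, 40, 140, 60), "head": (90, 20, 150, 90),
               "object": (300, 200, 360, 260)}
    fixations = [(120, 50), (118, 55), (310, 230), (500, 400)]
    print(fixation_proportions(fixations, regions))
```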


Nature | 1998

Common reference frame for neural coding of translational and rotational optic flow.

Douglas R. Wylie; Walter F. Bischof; Barrie J. Frost

Self-movement of an organism through the environment is guided jointly by information provided by the vestibular system and by visual pathways that are specialized for detecting ‘optic flow’. Motion of any object through space, including the self-motion of organisms, can be described with reference to six degrees of freedom: rotation about three orthogonal axes, and translation along these axes. Here we describe neurons in the pigeon brain that respond best to optic flow resulting from translation along one of the three orthogonal axes. We show that these translational optic flow neurons, like rotational optic flow neurons, share a common spatial frame of reference with the semicircular canals of the vestibular system. The three axes to which these neurons respond best are the vertical axis and two horizontal axes orientated at 45° to either side of the body midline.
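
To make the translation/rotation distinction concrete, the sketch below evaluates the standard pinhole-camera (Longuet-Higgins and Prazdny) flow equations at a few viewing directions; this is only an illustrative geometric aside, not the neural model or the analysis used in the paper.

```python
import numpy as np

def image_flow(x, y, Z, T, omega):
    """Optic flow (u, v) at image point (x, y) for a pinhole camera with
    focal length 1, scene depth Z, translation T = (Tx, Ty, Tz) and
    rotation omega = (wx, wy, wz): the standard flow equations."""
    Tx, Ty, Tz = T
    wx, wy, wz = omega
    u = (x * Tz - Tx) / Z + x * y * wx - (1 + x**2) * wy + y * wz
    v = (y * Tz - Ty) / Z + (1 + y**2) * wx - x * y * wy - x * wz
    return u, v

if __name__ == "__main__":
    xs, ys = np.meshgrid(np.linspace(-0.5, 0.5, 3), np.linspace(-0.5, 0.5, 3))
    # Translation along the line of sight: flow expands radially from the centre.
    print(np.round(image_flow(xs, ys, Z=2.0, T=(0, 0, 1), omega=(0, 0, 0)), 2))
    # Rotation about the vertical axis: flow is roughly uniform and depth-independent.
    print(np.round(image_flow(xs, ys, Z=2.0, T=(0, 0, 0), omega=(0, 0.1, 0)), 2))
```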


Psychonomic Bulletin & Review | 1999

The attentional blink with targets in different spatial locations

Troy A. W. Visser; Samantha M. Zuvic; Walter F. Bischof; Vincent Di Lollo

When two targets (T1 and T2) are displayed in rapid succession, accuracy of T2 identification varies as a function of the temporal lag between the targets (attentional blink, AB). In some studies, performance has been found to be most impaired at Lag 1—namely, when T2 followed T1 directly. In other studies, T2 performance at Lag 1 has been virtually unimpaired (Lag 1 sparing). In the present work, we examined how Lag 1 sparing is affected by attentional switches between targets displayed in the same location or in different locations. We found that Lag 1 sparing does not occur when a spatial shift is required between T1 and T2. This suggests that attention cannot be switched to a new location while the system is busy processing another stimulus. The results are explained by a modified version of an attentional gating model (Chun & Potter, 1995; Shapiro & Raymond, 1994).
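
A small sketch of how T2 accuracy is typically summarised by lag to expose the attentional blink and Lag 1 sparing; the trial-record format is an assumption, and scoring T2 only on trials with a correct T1 report is a common convention rather than a detail taken from this paper.

```python
from collections import defaultdict

def t2_given_t1(trials):
    """Per-lag T2 accuracy conditional on a correct T1 report (T2|T1).

    Each trial is a dict with 'lag', 't1_correct', and 't2_correct'
    (an assumed record format used only for this sketch)."""
    hits = defaultdict(int)
    total = defaultdict(int)
    for t in trials:
        if t["t1_correct"]:
            total[t["lag"]] += 1
            hits[t["lag"]] += t["t2_correct"]
    return {lag: hits[lag] / total[lag] for lag in sorted(total)}

if __name__ == "__main__":
    # Toy data showing Lag 1 sparing: high accuracy at lag 1, a dip at lag 3.
    trials = (
        [{"lag": 1, "t1_correct": True, "t2_correct": i < 9} for i in range(10)]
        + [{"lag": 3, "t1_correct": True, "t2_correct": i < 4} for i in range(10)]
        + [{"lag": 7, "t1_correct": True, "t2_correct": i < 9} for i in range(10)]
    )
    print(t2_given_t1(trials))   # {1: 0.9, 3: 0.4, 7: 0.9}
```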


Attention Perception & Psychophysics | 2004

Rapid serial visual distraction: task-irrelevant items can produce an attentional blink.

Troy A. W. Visser; Walter F. Bischof; Vincent Di Lollo

When two sequential targets (T1 and T2) are presented within about 600 msec, perception of the second target is impaired. This attentional blink (AB) has been studied by means of two paradigms: rapid serial visual presentation (RSVP), in which targets are embedded in a stream of central distractors, and the two-target paradigm, in which targets are presented eccentrically without distractors. We examined the role of distractors in the AB, using a modified two-target paradigm with a central stream of task-irrelevant distractors. In six experiments, the RSVP stream of distractors substantially impaired identification of both T1 and T2, but only when the distractors shared common characteristics with the targets. Without such commonalities, the distractors had no effect on performance. This points to the subjects’ attentional control setting as an important factor in the AB deficit and suggests a conceptual link between the AB and a form of nonspatial contingent capture attributable to distractor processing.


Computers and Biomedical Research | 1992

Automated detection and classification of breast tumors

Shun Leung Ng; Walter F. Bischof

This paper presents a new method for the mammographic detection and classification of two types of breast tumors, stellate lesions and circumscribed lesions. The method assumes that both types of tumors appear as approximately circular, bright masses with a fuzzy boundary and that stellate lesions are in addition surrounded by a radiating structure of sharp, fine lines. Experimental results for a set of 27 mammograms are presented and the method is shown to have a high detection rate and an extremely low false positive rate.
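
The two stated assumptions (a roughly circular bright mass; radiating sharp lines for stellate lesions) can be illustrated with a toy detector like the one below; the thresholds, ray-casting test, and classification rule are illustrative assumptions, not the published method.

```python
import numpy as np

def detect_bright_mass(img, threshold=0.6):
    """Return the centroid of above-threshold pixels, or None.

    A crude stand-in for the paper's detection of approximately circular,
    bright masses with fuzzy boundaries."""
    ys, xs = np.nonzero(img > threshold)
    if len(xs) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

def looks_stellate(img, center, radius=20, n_rays=36, edge_thresh=0.15):
    """Call a detected mass stellate if many rays leaving its centre cross a
    sharp brightness change (a toy proxy for the radiating fine lines)."""
    cy, cx = center
    sharp = 0
    for angle in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        prev = None
        for r in range(radius):
            y = int(round(cy + r * np.sin(angle)))
            x = int(round(cx + r * np.cos(angle)))
            if not (0 <= y < img.shape[0] and 0 <= x < img.shape[1]):
                break
            if prev is not None and abs(img[y, x] - prev) > edge_thresh:
                sharp += 1
                break
            prev = img[y, x]
    return sharp > n_rays // 2

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[24:40, 24:40] = 0.9                     # bright square as a toy "mass"
    c = detect_bright_mass(img)
    print(c, looks_stellate(img, c))            # sharp-edged toy mass trips the test
```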


Vision Research | 1990

Perception of directional sampled motion in relation to displacement and spatial frequency: Evidence for a unitary motion system

Walter F. Bischof; Vincent Di Lollo

Perception of directional motion was studied by displaying two images (F1 and F2) in rapid succession. The two images were identical except for a horizontal displacement of F2 with respect to F1. Observers reported the direction of horizontal motion over a wide range of displacements. The stimuli in Experiment 1 were one-dimensional gratings with spatial frequency between 0.125 and 6 c/deg. Motion was seen at all displacements to almost 0.5 cycles (counterphase) and remained invariant across spatial frequencies. In Experiment 2 the stimuli were band-pass filtered random-dot patterns. The bandwidth of the filters was 1 octave, and centre frequencies ranged from 0.75 to 12 c/deg. In every case, the response functions exhibited quasi-periodic oscillations related to structural properties of the images. One-dimensional analyses based on autocorrelation did not provide a satisfactory account of the data. By contrast, the data were fitted successfully by a two-dimensional analysis that integrated the responses of neighbouring motion detectors so as to yield a smooth motion flow field from which left-right directional motion could be derived. Practically and conceptually, the outcome supports a unitary motion system as distinct from separate systems subserving short-range and long-range motion.
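
The two-frame direction judgement can be mimicked with a minimal matching sketch that scores candidate horizontal displacements of F2 against F1 and reports the best one; this one-dimensional correlation is only a stand-in, not the two-dimensional flow-field analysis the paper argues for.

```python
import numpy as np

def best_horizontal_shift(f1, f2, max_shift=8):
    """Return the horizontal displacement (in pixels) of f2 relative to f1
    that maximizes image correlation; positive means rightward motion."""
    scores = {}
    for d in range(-max_shift, max_shift + 1):
        shifted = np.roll(f2, -d, axis=1)       # undo a candidate displacement
        scores[d] = float((f1 * shifted).sum())
    return max(scores, key=scores.get)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    f1 = rng.random((64, 64))
    f2 = np.roll(f1, 3, axis=1)                 # F2 is F1 displaced 3 pixels rightward
    print(best_horizontal_shift(f1, f2))        # -> 3
```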


Archive | 1997

Machine learning and image interpretation

Terry Caelli; Walter F. Bischof

In this groundbreaking new volume, computer researchers discuss the development of technologies and specific systems that can interpret data with respect to domain knowledge. Although the chapters each illuminate different aspects of image interpretation, all utilize a common approach - one that asserts such interpretation must involve perceptual learning in terms of automated knowledge acquisition and application, as well as feedback and consistency checks between encoding, feature extraction, and the known knowledge structures in a given application domain. The text is profusely illustrated with numerous figures and tables to reinforce the concepts discussed.


Visual Cognition | 2009

Get real! Resolving the debate about equivalent social stimuli

Elina Birmingham; Walter F. Bischof; Alan Kingstone

Gaze and arrow studies of spatial orienting have shown that eyes and arrows produce nearly identical effects on shifts of spatial attention. This has led some researchers to suggest that the human attention system considers eyes and arrows as equivalent social stimuli. However, this view does not fit with the general intuition that eyes are unique social stimuli nor does it agree with a large body of work indicating that humans possess a neural system that is preferentially biased to process information regarding human gaze. To shed light on this discrepancy we entertained the idea that the model cueing task may fail to measure some of the ways that eyes are special. Thus rather than measuring the orienting of attention to a location cued by eyes and arrows, we measured the selection of eyes and arrows embedded in complex real-world scenes. The results were unequivocal: People prefer to look at other people and their eyes; they rarely attend to arrows. This outcome was not predicted by visual saliency but it was predicted by the idea that eyes are social stimuli that are prioritized by the attention system. These data, and the paradigm from which they were derived, shed new light on past cueing studies of social attention, and they suggest a new direction for future investigations of social attention.

Collaboration


Dive into Walter F. Bischof's collaborations.

Top Co-Authors

Alan Kingstone

University of British Columbia

Jason J. S. Barton

University of British Columbia
