Network


Latest external collaborations at the country level.

Hotspot


Research topics in which Elizabeth S. Olds is active.

Publication


Featured research published by Elizabeth S. Olds.


Vision Research | 1998

Linearity across spatial frequency in object recognition.

Elizabeth S. Olds; Stephen A. Engel

In three experiments, we measured recognition as a function of exposure duration for three kinds of images of common objects: component images containing mainly low-spatial-frequency information, components containing mainly high-spatial-frequency information, and compound images created by summing the components. Our data were well fit by a model with a linear first stage in which the sums of the responses to the component images equalled the responses to the compound images. Our data were less well fit by a model in which the component responses combined by probability summation. These results support linear filter accounts of complex pattern recognition.
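
As a rough illustration of the two combination rules being compared (a hypothetical sketch, not the authors' model-fitting code; the function names and example values are assumptions), the linear rule sums the internal responses to the two component images, whereas probability summation treats the component recognition probabilities as independent chances:

    # Hypothetical illustration only; values below are arbitrary, not data
    # from the study.

    def linear_summation(r_low, r_high):
        # Linear first stage: the response to the compound image is the sum
        # of the responses to the low- and high-frequency component images.
        return r_low + r_high

    def probability_summation(p_low, p_high):
        # Probability summation: the compound is recognized if either
        # component alone would have been recognized (independent detectors).
        return 1.0 - (1.0 - p_low) * (1.0 - p_high)

    # Example with arbitrary component values:
    print(linear_summation(0.4, 0.5))       # summed internal response: 0.9
    print(probability_summation(0.4, 0.5))  # combined recognition probability: 0.7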


Psychonomic Bulletin & Review | 2000

Tracking visual search over space and time.

Elizabeth S. Olds; William B. Cowan; Pierre Jolicoeur

Visual perception consists of early preattentive processing and subsequent attention-demanding processing. Most researchers implicitly treat preattentive processing as a domain-dependent, indivisible stage. We show, however, by interrupting preattentive visual processing of color before its completion, that it can be dissected both temporally and spatially. The experiment depends on changing easy (preattentive) selection into difficult (attention-demanding) selection. We show that although the mechanism subserving preattentive selection completes processing as early as 200 msec after stimulus onset, partial selection information is available well before completion. Furthermore, partial selection occurs first at locations near fixation, spreading radially outward as processing proceeds.


Perception | 2004

Does Previewing One Stimulus Feature Help Conjunction Search?

Elizabeth S. Olds; K. Amanda Fockler

We examined the effects of previewing one aspect of a search display, in order to determine what subset of display information is most useful as a prelude to a search task. Observers were asked to indicate the presence or absence of a known target, in a conjunction search where the target was defined by the combination of colour and orientation (a yellow horizontal line presented among yellow vertical and pink horizontal distractors). In the colour preview condition of experiment 1, observers were first shown a 1 s preview of the locations and colours of the search items before the actual search set was presented. That is, search items first appeared as yellow and pink squares for 1 s, which each then turned into yellow and pink oriented lines (in the same locations) which comprised the display to be searched. In the orientation preview condition, observers were first shown a 1 s preview of the locations and orientations of the search items before the actual search display was presented. These two conditions were compared to a control condition consisting of standard conjunction search without any preview display. There was no effect of colour preview; there was a marginal effect of orientation preview, but in the opposite direction from what was expected—reaction time increased for orientation preview searches. In experiment 2 these previews were compared to two spatial cueing conditions; in this experiment the colour preview did provide a small amount of help. Finally, in experiment 3 both previews were presented in succession, and increased facilitation was found, in particular when the colour preview preceded the orientation preview. These findings are discussed in relation to the literature, in particular the Guided Search model (Wolfe et al 1989 Journal of Experimental Psychology: Human Perception and Performance 15 419–433; Wolfe 1994 Psychonomic Bulletin & Review 1 202–238).


Attention Perception & Psychophysics | 2000

Partial orientation pop-out helps difficult search for orientation.

Elizabeth S. Olds; William B. Cowan; Pierre Jolicœur

We interrupted pop-out search before it produced a detection response by adding extra distractors to the search display. We show that when pop-out for an orientation target fails because of this interruption, it nevertheless provides useful information to the processes responsible for difficult search. That is, partial pop-out assists difficult search. This interaction has also been found for color stimuli (Olds, Cowan, & Jolicœur, 2000a, 2000b). These results indicate that interactions and/or overlap between the mechanisms responsible for pop-out and the mechanisms responsible for difficult search may be quite general in early visual selection.


Attention Perception & Psychophysics | 1999

Stimulus-determined discrimination mechanisms for color search

Elizabeth S. Olds; William B. Cowan; Pierre Jolicoeur

Visual attention can be goal driven, stimulus driven, or a combination of the two. Here we report evidence for an unexpectedly stimulus-driven component of visual search for a target defined by color. Observers demonstrated a surprisingly cost-free ability to incorporate multiple classifiers in search for a target of one color from among distractors of other colors. A target color was presented among distractors that could change from trial to trial (intermixed presentation) or that remained constant across all trials in a block (blocked presentation). For blocked presentation, a single search classifier (a mechanism that segregates the target from distractors in color space) could be adopted, whereas for intermixed presentation different classifiers had to be used when the distractor colors changed. The benefit of blocked presentation was very small, suggesting that the appropriate classifier was determined very quickly in trials for which the classifier changed. The results suggest that the stimulus-driven activation of an appropriate stimulus classifier can be very efficient.


Vision Research | 2009

Feature head-start: Conjunction search following progressive feature disclosure.

Elizabeth S. Olds; Timothy J. Graham; Jeffery A. Jones

When a colour/orientation conjunction search display is immediately preceded by a display that shows either the colour or the orientation of each upcoming search item, search is faster after colour-preview than after orientation-preview. One explanation for this feature asymmetry is that colour has priority access to attentional selection relative to features such as orientation and size. In support of this hypothesis, we show that this asymmetry persists even after colour and orientation feature search performance is equated. However, this notion was ruled out by our subsequent experiments in which the target was defined by conjunction of colour and size; colour-preview was less helpful than size-preview (even though colour-feature search was faster than size-feature search, for these feature values). A final set of experiments tested size-preview vs. orientation-preview for size/orientation conjunction search, using stimuli for which orientation-feature search was easier than size-feature search. Size-preview produced much faster search than orientation-preview, demonstrating again that ease of feature search does not predict effects of a feature-preview. Overall, size produced the most facilitation when presented as a feature-preview (for both colour/size and size/orientation conjunctions), followed by colour (for colour/orientation conjunction but not for colour/size conjunction) and then orientation (which never facilitated search). Whilst each feature-preview may potentially facilitate search, the transition from feature-preview display to search display could disrupt search processes, because of luminance and/or colour changes. We see evidence for some sort of disruption when the feature-preview slows search. An explanation of this set of results must focus on both facilitation and disruption: these effects are not mutually exclusive, and neither suffices alone, since performance after feature-preview can be significantly better or significantly worse than conjunction baseline.


Attention Perception & Psychophysics | 2003

Does partial difficult search help difficult search?

Elizabeth S. Olds; Mark D. Degani

Olds, Cowan, and Jolicoeur (2000a, 2000b) showed that exposure to a display that affords pop-out search (a target among distractors of only one color) can assist processing of a related display that requires difficult search. They added distractors of an additional color to the initial simple display and analyzed response time distributions to show that exposure to the initial display aided subsequent search in the difficult portion (this finding was called search assistance). To test whether search assistance depends on perceptual grouping of the initial items, we presented initial items that were more difficult to group (two colors of distractors, instead of just one). The target appeared (on 50% of the trials) among distractors of two colors, and then after a delay, more distractors of those two colors were added to the display. Exposure to the initial easier portion of the display did not assist processing of the second portion of the display when the initial display contained a large number of items; we found tentative evidence for assistance with small numbers of initial items. In the Olds et al. (2000a, 2000b) studies, it was easy to group the initial distractor items, because they were all the same color. In contrast, in the present study, it was difficult to group the heterogeneous initial distractor items. Search assistance is found only when initial item grouping is relatively easy, and thus we conclude that search assistance depends on grouping.


Vision Research | 2005

Recognizing partially visible objects

Philip Servos; Elizabeth S. Olds; Peggy J. Planetta; G. Keith Humphrey

In two experiments we measured object recognition performance as a function of delay. In Experiment 1 we presented half of an image of an object, and then the other half after a variable delay. Objects were subdivided into top versus bottom halves, left versus right halves, or vertical strips. In Experiment 2 we separated the low (LSF) and high spatial frequency (HSF) components of an image, and presented one component followed by the other after a variable delay. For both experiments, performance was worse with a 105 ms delay between the presentations of the object components than when the two components were presented simultaneously. These results are consistent with predictions made by models that combine information at a relatively early stage in processing. In addition, the results revealed that object recognition performance is significantly better when the LSF sub-image preceded the HSF sub-image than when the HSF sub-image preceded the LSF sub-image, consistent with previous work suggesting that LSF information is processed prior to HSF in object recognition.
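
As a rough sketch of how an image can be split into LSF and HSF components of this kind (a hypothetical example using a Gaussian low-pass filter; it is not the authors' stimulus-generation code, and the cutoff value is an assumption):

    # Hypothetical sketch; sigma (the low-pass cutoff) is an assumed value.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def split_spatial_frequencies(image, sigma=4.0):
        # Low-pass filtering gives the LSF component; the residual is the HSF
        # component, so the two components sum back to the original image.
        lsf = gaussian_filter(image.astype(float), sigma=sigma)
        hsf = image.astype(float) - lsf
        return lsf, hsf

    # Example with a random array standing in for a greyscale object image.
    img = np.random.rand(256, 256)
    lsf, hsf = split_spatial_frequencies(img)
    assert np.allclose(lsf + hsf, img)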


Perception | 2003

Search of Jumping Items: Visual Marking and Discrete Motion

Elizabeth S. Olds; C. Meghan McMurtry

Watson and Humphreys (1997 Psychological Review 104 90 – 122) showed that when searching for a target, observers can ignore a previewed set of distractors (other items), effectively decreasing the number of relevant items in a difficult search display and thus speeding performance (‘visual marking’). Other researchers have more recently investigated visual marking for continuously moving items, finding that shared features, and preserved inter-item spatial relationships, are helpful. Here, we tested whether visual marking occurs for a set of initial items that moves in one discrete jump (preserving shared features and inter-item spatial relationships). Marking did not occur in these displays, and we interpret this result in the context of previous research on visual marking.


Journal of the Optical Society of America | 1999

Effective color CRT calibration techniques for perception research

Elizabeth S. Olds; William B. Cowan; Pierre Jolicoeur

Collaboration


Dive into Elizabeth S. Olds's collaborations.

Top Co-Authors

Jeffery A. Jones (Wilfrid Laurier University)
Timothy J. Graham (Wilfrid Laurier University)
Angela M. Weber (Wilfrid Laurier University)
Mark D. Degani (Wilfrid Laurier University)
G. Keith Humphrey (University of Western Ontario)
K. Amanda Fockler (Wilfrid Laurier University)
Pierre Jolicœur (University of Waterloo)