
Publication


Featured research published by L. Caitlin Elmore.


Experimental Brain Research | 2009

Sound enhances touch perception.

Tony Ro; Johanan Hsu; Nafi E. Yasar; L. Caitlin Elmore; Michael S. Beauchamp

Certain sounds, such as fingernails screeching down a chalkboard, have a strong association with somatosensory percepts. In order to assess the influences of audition on somatosensory perception, three experiments measured how task-irrelevant auditory stimuli alter detection rates for near-threshold somatosensory stimuli. In Experiment 1, we showed that a simultaneous auditory stimulus increases sensitivity, but not response biases, to the detection of an electrical cutaneous stimulus delivered to the hand. Experiment 2 demonstrated that this enhancement of somatosensory perception is spatially specific—only monaural sounds on the same side increased detection. Experiment 3 revealed that the effects of audition on touch are also frequency dependent—only sounds with the same frequency as the vibrotactile frequency enhanced tactile detection. These results indicate that auditory information influences touch perception in highly systematic ways and suggest that similar coding mechanisms may underlie the processing of information from these different sensory modalities.


Psychonomic Bulletin & Review | 2010

Testing pigeon memory in a change detection task

Anthony A. Wright; Jeffrey S. Katz; John F. Magnotti; L. Caitlin Elmore; Stephanie Babb; Sarah Alwin

Six pigeons were trained in a change detection task with four colors. They were shown two colored circles on a sample array, followed by a test array with the color of one circle changed. The pigeons learned to choose the changed color and transferred their performance to four unfamiliar colors, suggesting that they had learned a generalized concept of color change. They also transferred performance to test delays several times their 50-msec training delay without prior delay training. The accurate delay performance of several seconds suggests that their change detection was memory based, as opposed to a perceptual attentional capture process. These experiments are the first to show that an animal species (pigeons, in this case) can learn a change detection task identical to ones used to test human memory, thereby providing the possibility of directly comparing short-term memory processing across species.


Learning & Behavior | 2009

Individual differences: Either relational learning or item-specific learning in a same/different task

L. Caitlin Elmore; Anthony A. Wright; Jacquelyne J. Rivera; Jeffrey S. Katz

Three pigeons were trained in a three-item simultaneous same/different task. Three of six stimulus combinations were not trained (untrained set) and were tested later. Following acquisition, the subjects were tested with novel stimuli, the untrained set, training-stimulus inversions, and object shape and color manipulations. There was no novel-stimulus transfer—that is, no abstract-concept learning. Two pigeons showed partial transfer to untrained pairs and good transfer to stimulus inversions, suggesting that they had learned the relationship between the stimuli. Lack of transfer by the third pigeon suggests item-specific learning. The somewhat surprising finding of relational learning by two pigeons with only six training pairs suggests restricted-domain relational learning that was controlled more by color than by shape features. Individual differences of item-specific learning by one pigeon and relational learning by two others demonstrate that this task can be learned in different ways and that relational learning can occur in the absence of novel-stimulus transfer.


Journal of Experimental Psychology: Animal Learning and Cognition | 2015

Monkey visual short-term memory directly compared to humans.

L. Caitlin Elmore; Anthony A. Wright

Two adult rhesus monkeys were trained to detect which item in an array of memory items had changed using the same stimuli, viewing times, and delays as used with humans. Although the monkeys were extensively trained, they were less accurate than humans with the same array sizes (2, 4, & 6 items), with both stimulus types (colored squares, clip art), and showed calculated memory capacities of about 1 item (or less). Nevertheless, the memory results from both monkeys and humans for both stimulus types were well characterized by the inverse power-law of display size. This characterization provides a simple and straightforward summary of a fundamental process of visual short-term memory (VSTM): how VSTM declines with memory load. It emphasizes species similarities based upon similar functional relationships. By more closely matching monkey testing parameters to those of humans, the similar functional relationships strengthen the evidence suggesting similar processes underlying monkey and human VSTM.
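The capacity estimates of "about 1 item (or less)" mentioned in the abstract are typically derived from hit and false-alarm rates. The abstract does not specify which formula was used; a common estimator for single-probe change detection is Cowan's K, K = N × (hit rate − false-alarm rate). A minimal sketch with hypothetical rates:

```python
def cowan_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K capacity estimate for single-probe change detection:
    K = N * (hit rate - false-alarm rate)."""
    return set_size * (hit_rate - false_alarm_rate)

# Hypothetical example: 4-item arrays, 60% hits, 35% false alarms
print(cowan_k(4, 0.60, 0.35))  # roughly 1 item
```

The rates above are illustrative only, chosen to show how near-chance discrimination on multi-item arrays yields a capacity near one item.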


Behavioural Processes | 2013

Visual object complexity limits pigeon short-term memory.

John F. Magnotti; Adam M. Goodman; Thomas A. Daniel; L. Caitlin Elmore; Anthony A. Wright; Jeffrey S. Katz

The study of visual memory has repeatedly shown qualitatively similar visual short-term memory (VSTM) systems between humans and many nonhuman species. In studies of human VSTM using change detection, increasing visual object complexity has an inverse effect on accuracy. In the current study, we assessed the functional relationship between visual object complexity and memory performance in visual change detection in pigeons and humans. Visual object complexity was quantified for each object type within each species using visual target search. Change detection performance was inversely related to object complexity in both species, suggesting that pigeon VSTM, like human VSTM, is limited by visual object complexity. Human participants were able to use a verbal-labeling strategy to mitigate some of the effect of visual object complexity, suggesting a qualitative difference in how the two species may solve certain visual discriminations. Considering the visual complexity of novel objects may also help explain previous failures to transfer relational rules to novel visual objects.


Behavioural Processes | 2013

Change detection for the study of object and location memory.

L. Caitlin Elmore; Antony D. Passaro; Anthony A. Wright

Seven adult human participants were tested in change detection tasks for object and location memory with large and small sets of four different stimulus types. Blocked tests demonstrated that participants performed similarly in separate object and location tests with matched parameters and displays. In mixed tests, participants were informed that they would be tested with either object changes or location changes; surprisingly, they were nearly as accurate remembering both objects and locations as when either was tested alone. By contrast, in the large-set condition, performance was lower than baseline on surprise probe test trials in which participants were tested (on 13% of trials) with the change type opposite to the present block (e.g., location probe trials during the object change block). These probe-test results were further supported by the reduction in probe-baseline differences when tested with small sets (6) of these item types. Small sets required remembering locations and objects to resolve object-location confounds. Together these results show that humans can remember both objects and locations with little loss of accuracy when instructed to do so, but do not learn these contextual associations without instruction.


Animal Cognition | 2013

Testing visual short-term memory of pigeons (Columba livia) and a rhesus monkey (Macaca mulatta) with a location change detection task

Kenneth J. Leising; L. Caitlin Elmore; Jacquelyne J. Rivera; John F. Magnotti; Jeffrey S. Katz; Anthony A. Wright

Change detection is commonly used to assess capacity (number of objects) of human visual short-term memory (VSTM). Comparisons with the performance of non-human animals completing similar tasks have shown similarities and differences in object-based VSTM, which is only one aspect ("what") of memory. Another important aspect of memory, which has received less attention, is spatial short-term memory for "where" an object is in space. In this article, we show for the first time that a monkey and pigeons can be trained to accurately identify location changes, much as humans do, in change detection tasks similar to those used to test object capacity of VSTM. The subjects' task was to identify (touch/peck) an item that changed location across a brief delay. Both the monkey and pigeons showed transfer to delays longer than the training delay, to greater and smaller distance changes than in training, and to novel colors. These results are the first to demonstrate location-change detection in any non-human species and encourage comparative investigations into the nature of spatial and visual short-term memory.


Current Biology | 2011

Visual Short-Term Memory Compared in Rhesus Monkeys and Humans

L. Caitlin Elmore; Wei Ji Ma; John F. Magnotti; Kenneth J. Leising; Antony D. Passaro; Jeffrey S. Katz; Anthony A. Wright


Journal of Comparative Psychology | 2012

Change detection by rhesus monkeys (Macaca mulatta) and pigeons (Columba livia).

L. Caitlin Elmore; John F. Magnotti; Jeffrey S. Katz; Anthony A. Wright


Behavioural Processes | 2016

Pigeon visual short-term memory directly compared to primates

Anthony A. Wright; L. Caitlin Elmore

Collaboration


Dive into L. Caitlin Elmore's collaboration.

Top Co-Authors

Anthony A. Wright
University of Texas Health Science Center at Houston

John F. Magnotti
Baylor College of Medicine

Antony D. Passaro
University of Texas Health Science Center at Houston

Jacquelyne J. Rivera
University of Texas Health Science Center at Houston