Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Aleksandra Sherman is active.

Publication


Featured research published by Aleksandra Sherman.


Seeing and Perceiving | 2012

The Role of Stereopsis, Motion Parallax, Perspective and Angle Polarity in Perceiving 3-D Shape

Aleksandra Sherman; Thomas V. Papathomas; Anshul Jain; Brian P. Keane

We studied how stimulus attributes (angle polarity and perspective) and data-driven signals (motion parallax and binocular disparity) affect recovery of 3-D shape. We used physical stimuli, which consisted of two congruent trapezoids forming a dihedral angle. To study the effects of the stimulus attributes, we used 2 × 2 combinations of convex/concave angles and proper/reverse perspective cues. To study the effects of binocular disparity and motion parallax, we used 2 × 2 combinations of monocular/binocular viewing with moving/stationary observers. The task was to report the depth of the right vertical edge relative to a fixation point positioned at a different depth. In Experiment 1 observers also had the option of reporting that the right vertical edge and fixation point were at the same depth. However, in Experiment 2, observers were only given two response options: is the right vertical edge in front of/behind the fixation point? We found that across all stimulus configurations, perspective is a stronger cue than angle polarity in recovering 3-D shape; we also confirm the bias to perceive convex compared to concave angles. In terms of data-driven signals, binocular disparity recovered 3-D shape better than motion parallax. Interestingly, motion parallax improved performance for monocular viewing but not for binocular viewing.


Psychonomic Bulletin & Review | 2012

Laughter exaggerates happy and sad faces depending on visual context

Aleksandra Sherman; Timothy D. Sweeny; Marcia Grabowecky; Satoru Suzuki

Laughter is an auditory stimulus that powerfully conveys positive emotion. We investigated how laughter influenced the visual perception of facial expressions. We presented a sound clip of laughter simultaneously with a happy, a neutral, or a sad schematic face. The emotional face was briefly presented either alone or among a crowd of neutral faces. We used a matching method to determine how laughter influenced the perceived intensity of the happy, neutral, and sad expressions. For a single face, laughter increased the perceived intensity of a happy expression. Surprisingly, for a crowd of faces, laughter produced an opposite effect, increasing the perceived intensity of a sad expression in a crowd. A follow-up experiment revealed that this contrast effect may have occurred because laughter made the neutral distractor faces appear slightly happy, thereby making the deviant sad expression stand out in contrast. A control experiment ruled out semantic mediation of the laughter effects. Our demonstration of the strong context dependence of laughter effects on facial expression perception encourages a reexamination of the previously demonstrated effects of prosody, speech content, and mood on face perception, as they may be similarly context dependent.


Psychonomic Bulletin & Review | 2013

Auditory rhythms are systemically associated with spatial-frequency and density information in visual scenes

Aleksandra Sherman; Marcia Grabowecky; Satoru Suzuki

A variety of perceptual correspondences between auditory and visual features have been reported, but few studies have investigated how rhythm, an auditory feature defined purely by dynamics relevant to speech and music, interacts with visual features. Here, we demonstrate a novel crossmodal association between auditory rhythm and visual clutter. Participants were shown a variety of visual scenes from diverse categories and asked to report the auditory rhythm that perceptually matched each scene by adjusting the rate of amplitude modulation (AM) of a sound. Participants matched each scene to a specific AM rate with surprising consistency. A spatial-frequency analysis showed that scenes with greater contrast energy in midrange spatial frequencies were matched to faster AM rates. Bandpass-filtering the scenes indicated that greater contrast energy in this spatial-frequency range was associated with an abundance of object boundaries and contours, suggesting that participants matched more cluttered scenes to faster AM rates. Consistent with this hypothesis, AM-rate matches were strongly correlated with perceived clutter. Additional results indicated that both AM-rate matches and perceived clutter depend on object-based (cycles per object) rather than retinal (cycles per degree of visual angle) spatial frequency. Taken together, these results suggest a systematic crossmodal association between auditory rhythm, representing density in the temporal domain, and visual clutter, representing object-based density in the spatial domain. This association may allow for the use of auditory rhythm to influence how visual clutter is perceived and attended.
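The spatial-frequency analysis described above — measuring a scene's contrast energy within a band of spatial frequencies — can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the band limits, image size, and pixels-per-degree value are assumptions for the example.

```python
# Illustrative sketch (not the authors' analysis code): estimating contrast
# energy of a grayscale scene within a spatial-frequency band via the 2-D FFT.
import numpy as np

def bandpass_contrast_energy(image, low_cpd, high_cpd, pixels_per_degree):
    """Sum of Fourier power within [low_cpd, high_cpd) cycles/degree."""
    img = image - image.mean()                  # remove DC (mean luminance)
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))  # cycles/pixel
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))
    fxx, fyy = np.meshgrid(fx, fy)
    radius_cpd = np.hypot(fxx, fyy) * pixels_per_degree  # cycles/degree
    band = (radius_cpd >= low_cpd) & (radius_cpd < high_cpd)
    return float(np.sum(np.abs(spectrum[band]) ** 2))

# Usage: a noise image carries more mid-band energy than a uniform one.
rng = np.random.default_rng(0)
noisy = rng.standard_normal((128, 128))
flat = np.zeros((128, 128))
assert bandpass_contrast_energy(noisy, 1.0, 4.0, 32) > bandpass_contrast_energy(flat, 1.0, 4.0, 32)
```

In this framing, "greater contrast energy in midrange spatial frequencies" corresponds to a larger band sum for a mid-range `[low_cpd, high_cpd)` window, which the study found was matched to faster AM rates.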


Journal of Experimental Psychology: Human Perception and Performance | 2015

In the Working Memory of the Beholder: Art Appreciation Is Enhanced When Visual Complexity Is Compatible With Working Memory

Aleksandra Sherman; Marcia Grabowecky; Satoru Suzuki

What shapes art appreciation? Much research has focused on the importance of visual features themselves (e.g., symmetry, natural scene statistics) and of the viewer's experience and expertise with specific artworks. However, even after taking these factors into account, there are considerable individual differences in art preferences. Our new result suggests that art preference is also influenced by the compatibility between visual properties and the characteristics of the viewer's visual system. Specifically, we have demonstrated, using 120 artworks from diverse periods, cultures, genres, and styles, that art appreciation is increased when the level of visual complexity within an artwork is compatible with the viewer's visual working memory capacity. The result highlights the importance of the interaction between visual features and the beholder's general visual capacity in shaping art appreciation.


Science | 2017

Culturally inclusive STEM education

Amanda J. Zellmer; Aleksandra Sherman



PLOS ONE | 2017

Pattern Classification of EEG Signals Reveals Perceptual and Attentional States

Alexandra List; Aleksandra Sherman; Monica D. Rosenberg; Michael Esterman

Pattern classification techniques have been widely used to differentiate neural activity associated with different perceptual, attentional, or other cognitive states, often using fMRI, but more recently with EEG as well. Although these methods have identified EEG patterns (i.e., scalp topographies of EEG signals occurring at certain latencies) that decode perceptual and attentional states on a trial-by-trial basis, they have yet to be applied to the spatial scope of attention toward global or local features of the display. Here, we initially used pattern classification to replicate and extend the findings that perceptual states could be reliably decoded from EEG. We found that visual perceptual states, including stimulus location and object category, could be decoded with high accuracy peaking between 125–250 ms, and that the discriminative spatiotemporal patterns mirrored and extended our (and other well-established) ERP results. Next, we used pattern classification to investigate whether spatiotemporal EEG signals could reliably predict attentional states, and particularly, the scope of attention. The EEG data were reliably differentiated for local versus global attention on a trial-by-trial basis, emerging as a specific spatiotemporal activation pattern over posterior electrode sites during the 250–750 ms interval after stimulus onset. In sum, we demonstrate that multivariate pattern analysis of EEG, which reveals unique spatiotemporal patterns of neural activity distinguishing between behavioral states, is a sensitive tool for characterizing the neural correlates of perception and attention.
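The trial-by-trial decoding approach described in this abstract can be sketched with a cross-validated linear classifier over flattened channel × time EEG features. This is a minimal illustrative sketch on synthetic data, not the paper's pipeline; the trial counts, channel layout, injected signal, and classifier choice are all assumptions.

```python
# Illustrative sketch (not the paper's pipeline): decoding a binary state
# (e.g., local vs. global attention) from spatiotemporal EEG features with
# cross-validated linear classification, on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_trials, n_channels, n_times = 200, 32, 50        # assumed sizes for the demo
labels = rng.integers(0, 2, n_trials)              # toy labels: 0=local, 1=global
eeg = rng.standard_normal((n_trials, n_channels, n_times))
# Inject a weak class-dependent pattern over "posterior" channels, late window,
# mimicking a condition-specific spatiotemporal activation.
eeg[labels == 1, 20:, 25:] += 0.5

X = eeg.reshape(n_trials, -1)                      # flatten channels x time
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, labels, cv=5)     # 5-fold decoding accuracy
print(round(scores.mean(), 2))                     # above chance (0.5) here
```

The classifier weights over the flattened features play the role of the discriminative spatiotemporal pattern; real analyses would use per-subject epoched EEG rather than synthetic arrays.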


Seeing and Perceiving | 2012

Natural scenes have matched amplitude-modulated sounds that systematically influence visual scanning

Marcia Grabowecky; Satoru Suzuki; Aleksandra Sherman

We have previously demonstrated a linear perceptual relationship between auditory amplitude-modulation (AM) rate and visual spatial-frequency using gabors as the visual stimuli. Can this frequency-based auditory–visual association influence perception of natural scenes? Participants consistently matched specific auditory AM rates to diverse visual scenes (nature, urban, and indoor). A correlation analysis indicated that higher subjective density ratings were associated with faster AM-rate matches. Furthermore, both the density ratings and AM-rate matches were relatively scale invariant, suggesting that the underlying crossmodal association is between visual coding of object-based density and auditory coding of AM rate. Based on these results, we hypothesized that concurrently presented fast (7 Hz) or slow (2 Hz) AM-rates might influence how visual attention is allocated to dense or sparse regions within a scene. We tested this hypothesis by monitoring eye movements while participants examined scenes for a subsequent memory task. To determine whether fast or slow sounds guided eye movements to specific spatial frequencies, we computed the maximum contrast energy at each fixation across 12 spatial frequency bands ranging from 0.06–10.16 cycles/degree. We found that the fast sound significantly guided eye movements toward regions of high spatial frequency, whereas the slow sound guided eye movements away from regions of high spatial frequency. This suggests that faster sounds may promote a local scene scanning strategy, acting as a ‘filter’ to individuate objects within dense regions. Our results suggest that auditory AM rate and visual object density are crossmodally associated, and that this association can modulate visual inspection of scenes.
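An amplitude-modulated sound of the kind used here (e.g., the 7 Hz "fast" and 2 Hz "slow" stimuli) can be sketched as a carrier multiplied by a slow sinusoidal envelope. This is a hedged illustration of the stimulus class, with an assumed carrier frequency and duration, not a reconstruction of the study's actual sounds.

```python
# Illustrative sketch (assumed parameters, not the study's stimuli): an
# amplitude-modulated (AM) tone whose modulation rate is adjustable.
import numpy as np

def am_tone(am_rate_hz, carrier_hz=440.0, duration_s=1.0, sample_rate=44100):
    """Sine carrier multiplied by a raised-cosine envelope at am_rate_hz."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * am_rate_hz * t))  # in [0, 1]
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

fast = am_tone(7.0)   # "fast" rhythm (7 Hz AM)
slow = am_tone(2.0)   # "slow" rhythm (2 Hz AM)
assert fast.shape == slow.shape == (44100,)
```

Adjusting `am_rate_hz` is the single degree of freedom participants controlled in the matching task.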


Frontiers in Human Neuroscience | 2017

What Is Art Good For? The Socio-Epistemic Value of Art

Aleksandra Sherman; Clair Morrissey


Journal of Vision | 2010

Motion induced pitch: a case of visual-auditory synesthesia

Casey Noble; Julia Mossbridge; Lucica Iordanescu; Aleksandra Sherman; Alexandra List; Marcia Grabowecky; Satoru Suzuki


Journal of Vision | 2010

Neural signatures of local and global biases induced by automatic versus controlled attention

Alexandra List; Aleksandra Sherman; Anastasia V. Flevaris; Marcia Grabowecky; Satoru Suzuki

Collaboration


Dive into Aleksandra Sherman's collaborations.

Top Co-Authors

Casey Noble

Northwestern University
