

Publications


Featured research published by Christiane B. Wiebel.


Journal of Vision | 2013

Perceptual qualities and material classes.

Roland W. Fleming; Christiane B. Wiebel; Karl R. Gegenfurtner

Under typical viewing conditions, we can easily group materials into distinct classes (e.g., woods, plastics, textiles). Additionally, we can also make many other judgments about material properties (e.g., hardness, rigidity, colorfulness). Although these two types of judgment (classification and inferring material properties) have different requirements, they likely facilitate one another. We conducted two experiments to investigate the interactions between material classification and judgments of material qualities in both the visual and semantic domains. In Experiment 1, nine students viewed 130 images of materials from 10 different classes. For each image, they rated nine subjective properties (glossiness, transparency, colorfulness, roughness, hardness, coldness, fragility, naturalness, prettiness). In Experiment 2, 65 subjects were given the verbal names of six material classes, which they rated in terms of 42 adjectives describing material qualities. In both experiments, there was notable agreement between subjects, and a relatively small number of factors (weighted combinations of different qualities) were substantially independent of one another. Despite the difficulty of classifying materials from images (Liu, Sharan, Adelson, & Rosenholtz, 2010), the different classes were well clustered in the feature space defined by the subjective ratings. K-means clustering could correctly identify class membership for over 90% of the samples, based on the average ratings across subjects. We also found a high degree of consistency between the two tasks, suggesting subjects access similar information about materials whether judging their qualities visually or from memory. Together, these findings show that perceptual qualities are well defined, distinct, and systematically related to material class membership.
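The k-means result above (class membership recovered for over 90% of samples from averaged ratings) can be illustrated with a small sketch. All numbers here are synthetic stand-ins, not the study's data; the class count, sample count, and rating scale are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 material classes, 10 samples each, rated on
# 9 subjective properties on a 1-7 scale (values are synthetic).
n_classes, n_per_class, n_props = 3, 10, 9
centers = rng.uniform(1, 7, size=(n_classes, n_props))   # class-mean ratings
labels = np.repeat(np.arange(n_classes), n_per_class)
ratings = centers[labels] + rng.normal(0, 0.3, size=(len(labels), n_props))

def kmeans(X, k, n_iter=50):
    """Plain k-means with deterministic farthest-point initialization."""
    cent = [X[0]]
    for _ in range(k - 1):
        d = ((X[:, None] - np.array(cent)[None]) ** 2).sum(-1).min(axis=1)
        cent.append(X[np.argmax(d)])          # seed far from existing seeds
    cent = np.array(cent)
    for _ in range(n_iter):
        assign = ((X[:, None] - cent[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                cent[j] = X[assign == j].mean(axis=0)
    return assign

assign = kmeans(ratings, n_classes)

# Score: each cluster "votes" for its majority true class.
correct = sum(np.bincount(labels[assign == j]).max() for j in range(n_classes))
accuracy = correct / len(labels)
print(f"cluster recovery: {accuracy:.0%}")
```

With classes this well separated in rating space, recovery is near perfect, mirroring the paper's finding that subjective-quality ratings cluster materials by class.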


Multisensory Research | 2013

Visual and Haptic Representations of Material Properties

Elisabeth Baumgartner; Christiane B. Wiebel; Karl R. Gegenfurtner

Research on material perception has received an increasing amount of attention recently. Clearly, both the visual and the haptic sense play important roles in the perception of materials, yet it is still unclear how both senses compare in material perception tasks. Here, we set out to investigate the degree of correspondence between the visual and the haptic representations of different materials. We asked participants to both categorize and rate 84 different materials for several material properties. In the haptic case, participants were blindfolded and asked to assess the materials based on haptic exploration. In the visual condition, participants assessed the stimuli based on their visual impressions only. While categorization performance was less consistent in the haptic condition than in the visual one, ratings correlated highly between the visual and the haptic modality. PCA revealed that all material samples were similarly organized within the perceptual space in both modalities. Moreover, in both senses the first two principal components were dominated by hardness and roughness. These are two material features that are fundamental for the haptic sense. We conclude that although the haptic sense seems to be crucial for material perception, the information it can gather alone might not be quite fine-grained and rich enough for perfect material recognition.
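The PCA analysis described above can be sketched as follows. The data are synthetic: two latent factors (think "hardness" and "roughness") are assumed to drive the ratings, and the material and property counts are taken from the abstract purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ratings: 84 materials x 6 properties, constructed so that
# two latent dimensions dominate (real data would be observer ratings).
n_materials, n_props = 84, 6
latent = rng.normal(size=(n_materials, 2))       # 2 underlying factors
loadings = rng.normal(size=(2, n_props))         # factor-to-rating mapping
ratings = latent @ loadings + 0.1 * rng.normal(size=(n_materials, n_props))

def pca(X):
    """PCA via SVD of the mean-centered data matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)   # variance fraction per component
    return Vt, explained

components, explained = pca(ratings)
print("variance explained by first two PCs:", explained[:2].sum())
```

When two factors generate the ratings, the first two principal components capture nearly all the variance, which is the pattern the study reports for hardness and roughness in both modalities.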


Attention Perception & Psychophysics | 2013

The speed and accuracy of material recognition in natural images

Christiane B. Wiebel; Matteo Valsecchi; Karl R. Gegenfurtner

We studied the time course of material categorization in natural images relative to superordinate and basic-level object categorization, using a backward-masking paradigm. We manipulated several low-level features of the images—including luminance, contrast, and color—to assess their potential contributions. The results showed that the speed of material categorization was roughly comparable to the speed of basic-level object categorization, but slower than that of superordinate object categorization. The performance seemed to be crucially mediated by low-level factors, with color leading to a solid increase in performance for material categorization. At longer presentation durations, material categorization was less accurate than both types of object categorization. Taken together, our results show that material categorization can be as fast as basic-level object categorization, but is less accurate.


Vision Research | 2015

Statistical correlates of perceived gloss in natural images.

Christiane B. Wiebel; Matteo Toscani; Karl R. Gegenfurtner

It is currently debated whether the perception of gloss is linked to the statistical parameters of the retinal image. In particular, it has been suggested that gloss is highly correlated with the skewness of the luminance histogram. However, other psychophysical work with artificial stimuli has shown that skewness alone is not enough to induce the perception of gloss. Here, we analyzed many images of natural surfaces to search for potential statistical correlates of perceived gloss. We found that skewness indeed correlates with gloss when using rendered stimuli, but that the standard deviation, a measure of contrast, correlates better with perceived gloss when using photographs of natural surfaces. We verified the important role of contrast by manipulating skewness and contrast within images. Changing the contrast in images significantly modulates perceived gloss, but manipulating the skewness of the luminance histogram had only a small effect.
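The two image statistics compared above, the standard deviation and the skewness of the luminance histogram, are straightforward to compute. This sketch uses synthetic grayscale arrays rather than photographs of surfaces; the "highlight" manipulation is only meant to show how sparse bright pixels raise skewness.

```python
import numpy as np

def luminance_stats(image):
    """Standard deviation (a contrast measure) and skewness (third
    standardized moment) of an image's luminance histogram."""
    lum = np.asarray(image, dtype=float).ravel()
    mu, sd = lum.mean(), lum.std()
    skew = np.mean(((lum - mu) / sd) ** 3)
    return sd, skew

# Illustrative check on synthetic "images" (not real surface photos):
rng = np.random.default_rng(0)
symmetric = rng.normal(0.5, 0.1, size=(64, 64))   # roughly zero skew
highlighted = symmetric.copy()
highlighted.flat[:200] = 1.0                      # sparse bright "highlights"

print(luminance_stats(symmetric))
print(luminance_stats(highlighted))
```

Adding sparse highlights raises both statistics, which is why teasing their contributions apart requires manipulating skewness and contrast independently, as the study did.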


Journal of Vision | 2014

Early differential processing of material images: Evidence from ERP classification.

Christiane B. Wiebel; Matteo Valsecchi; Karl R. Gegenfurtner

Investigating the temporal dynamics of natural image processing using event-related potentials (ERPs) has a long tradition in object recognition research. In a classical Go-NoGo task, two characteristic effects have been emphasized: an early task-independent category effect and a later task-dependent target effect. Here, we set out to use this well-established Go-NoGo paradigm to study the time course of material categorization. Material perception has gained increasing interest in recent years, after its importance under natural viewing conditions was long overlooked. In addition to analyzing standard ERPs, we conducted a single-trial ERP pattern analysis. To validate this procedure, we also measured ERPs in two object categories (people and animals). Our linear classification procedure was able to largely capture the overall pattern of results from the canonical analysis of the ERPs, and even extend it. We replicate the known target effect (differential Go-NoGo potential at frontal sites) for the material images. Furthermore, we observe task-independent differential activity between the two material categories as early as 140 ms after stimulus onset. Using our linear classification approach, we show that material categories can be differentiated consistently based on the ERP pattern in single trials around 100 ms after stimulus onset, independent of their target-related status. This strengthens the idea of early differential visual processing of material categories independent of the task, probably due to differences in low-level image properties, and suggests that pattern classification of ERP topographies is a powerful instrument for investigating electrophysiological brain activity.
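The idea of classifying single trials from ERP topographies can be sketched with a minimal linear decision rule. The data below are synthetic (random "channel" patterns, not EEG recordings), and the classifier is a simple nearest-class-mean rule, an assumption for illustration; the paper's exact classifier is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-trial "topographies": 2 categories, 64 channels.
n_trials, n_channels = 200, 64
labels = np.repeat([0, 1], n_trials // 2)
means = rng.normal(0, 1, size=(2, n_channels))    # category templates
trials = means[labels] + rng.normal(0, 2.0, size=(n_trials, n_channels))

# Train/test split; classify each test trial by the nearest
# training-set class mean (a linear decision boundary).
idx = rng.permutation(n_trials)
train, test = idx[:150], idx[150:]
templates = np.stack([trials[train][labels[train] == c].mean(axis=0)
                      for c in (0, 1)])
dists = ((trials[test][:, None] - templates[None]) ** 2).sum(-1)
pred = dists.argmin(axis=1)
accuracy = (pred == labels[test]).mean()
print(f"single-trial classification accuracy: {accuracy:.0%}")
```

Even this minimal rule separates the synthetic categories well above chance, which is the logic behind reading out category information from single-trial ERP patterns.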


Journal of Vision | 2016

Testing the role of Michelson contrast for the perception of surface lightness.

Christiane B. Wiebel; Manish Singh; Marianne Maertens

It is still an unresolved question how the visual system perceives surface lightness, given the ambiguity of the sensory input signal. We studied lightness perception using two-dimensional images of variegated checkerboards shown as perspective projections of three-dimensional objects. We manipulated the contrast of a target check relative to its surround, either by rendering the image under different viewing conditions or by introducing noncoincidental changes of the reflectance of the surfaces adjacent to the target. We examined the predictive power of the normalized contrast model (Zeiner & Maertens, 2014) for the different viewing conditions (plain view vs. dark and light transparency) as well as for the noncoincidental surround changes (only high or only low reflectances in the surround). The model accounted for lightness matches across different viewing conditions but not for the surround changes. The observed simultaneous contrast effects were smaller than what would be predicted by the model. We evaluated two model extensions that, both relying on contrast, predicted the observed data well. Both model extensions point to the importance of contrast statistics across space and/or time for the computation of lightness, but future testing is needed to evaluate whether and how the visual system could represent such statistics.
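The quantity named in the title has a standard definition: Michelson contrast is (Lmax − Lmin) / (Lmax + Lmin) over the luminances involved. How exactly the study computed target-to-surround contrast is not stated in the abstract; the sketch below just shows the formula on made-up luminance values.

```python
import numpy as np

def michelson_contrast(luminances):
    """Michelson contrast: (Lmax - Lmin) / (Lmax + Lmin)."""
    lum = np.asarray(luminances, dtype=float)
    lmax, lmin = lum.max(), lum.min()
    return (lmax - lmin) / (lmax + lmin)

# Example: a target check against surrounds of different luminance
# (hypothetical values in cd/m^2).
print(michelson_contrast([20.0, 80.0]))   # 0.6
print(michelson_contrast([40.0, 60.0]))   # 0.2
```

The same mean luminance can thus yield very different contrasts, which is why contrast rather than raw luminance is the candidate cue for lightness here.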


Journal of Vision | 2017

Maximum likelihood difference scales represent perceptual magnitudes and predict appearance matches

Christiane B. Wiebel; Guillermo Aguilar; Marianne Maertens

One central problem in perception research is to understand how internal experiences are linked to physical variables. Most commonly, this relationship is measured using the method of adjustment, but this has two shortcomings: The perceptual scales that relate physical and perceptual variables are not measured directly, and the method often requires perceptual comparisons between viewing conditions. To overcome these problems, we measured perceptual scales of surface lightness using maximum likelihood difference scaling, asking observers only to compare the lightness of surfaces presented in the same context. Observers were lightness constant, and the perceptual scales qualitatively and quantitatively predicted perceptual matches obtained in a conventional adjustment experiment. Additionally, we show that a contrast-based model of lightness perception predicted 98% of the variance in the scaling and 88% in the matching data. We suggest that the predictive power was higher for scales because they are closer to the true variables of interest.


Vision Research | 2015

A comparison of haptic material perception in blind and sighted individuals.

Elisabeth Baumgartner; Christiane B. Wiebel; Karl R. Gegenfurtner


Journal of Vision | 2013

The perception of gloss in natural images

Karl R. Gegenfurtner; Elisabeth Baumgartner; Christiane B. Wiebel


Journal of Vision | 2013

Visual and Haptic Representations of Material Qualities

Christiane B. Wiebel; Elisabeth Baumgartner; Karl R. Gegenfurtner

Collaboration


Dive into Christiane B. Wiebel's collaborations.

Top Co-Authors

Marianne Maertens
Technical University of Berlin

Guillermo Aguilar
Technical University of Berlin