Publications

A toolbox for calculating objective image properties in aesthetics research

Published in arXiv preprint, 2024

Over the past two decades, researchers in the field of visual aesthetics have studied numerous quantitative (objective) image properties and how they relate to visual aesthetic appreciation. However, results are difficult to compare between research groups. One reason is that researchers use different sets of image properties in their studies. But even when the same properties are used, the image pre-processing techniques may differ, and researchers often use their own customized scripts to calculate the image properties. To make research results in visual experimental aesthetics more accessible and comparable, we developed an open-access and easy-to-use toolbox (called the ‘Aesthetics Toolbox’). The Toolbox allows users to calculate a well-defined set of quantitative image properties popular in contemporary research. The properties include lightness and color statistics, Fourier spectral properties, fractality, self-similarity, and symmetry, as well as different entropy measures and CNN-based variances. Compatible with most devices, the Toolbox provides an intuitive drag-and-drop web interface. In the Toolbox, we integrated the original scripts of four different research groups and translated them into Python 3. To ensure consistency across analyses, we verified that the Python versions of the scripts produce the same results as the original scripts. The Toolbox, detailed documentation, and a link to the cloud version are available via GitHub: this https URL. In summary, we developed a toolbox that helps to standardize and simplify the calculation of quantitative image properties for visual aesthetics research.
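
As a rough illustration of what such properties look like in code, here is a minimal Python sketch, not the Aesthetics Toolbox’s actual API, that computes two of the listed measures (mean lightness and Shannon entropy of the lightness histogram) with Pillow and NumPy; the file name is a hypothetical placeholder.

```python
# Minimal sketch (NOT the Aesthetics Toolbox API): computes two of the
# listed properties, mean lightness and Shannon entropy, for one image.
# Assumes Pillow and NumPy are installed; "example.jpg" is hypothetical.
import numpy as np
from PIL import Image

def lightness_stats(path):
    """Mean and standard deviation of lightness (grayscale proxy, 0-255)."""
    img = Image.open(path).convert("L")   # grayscale as a lightness proxy
    values = np.asarray(img, dtype=np.float64)
    return values.mean(), values.std()

def shannon_entropy(path, bins=256):
    """Shannon entropy (in bits) of the lightness histogram."""
    img = Image.open(path).convert("L")
    hist, _ = np.histogram(np.asarray(img), bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]                          # drop empty bins before log
    return float(-(p * np.log2(p)).sum())

if __name__ == "__main__":
    mean_l, sd_l = lightness_stats("example.jpg")
    print(f"mean lightness: {mean_l:.1f}, SD: {sd_l:.1f}")
    print(f"entropy: {shannon_entropy('example.jpg'):.2f} bits")
```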

Recommended citation: Redies, C., Bartho, R., Koßmann, L., Spehar, B., Hübner, R., Wagemans, J., & Hayn-Leichsenring, G. U. (2024). A toolbox for calculating objective image properties in aesthetics research. arXiv preprint arXiv:2408.10616. https://arxiv.org/abs/2408.10616

Reconstructing a disambiguation sequence that forms perceptual memory of multistable displays via reverse correlation method: Bias onset perception but gently

Published in Journal of Vision, 2023

When multistable displays are presented intermittently with long blank intervals, their onset perception is determined by perceptual memory of multistable displays. We investigated when and how this memory is formed, using a reverse correlation method and bistable kinetic-depth-effect displays. Each experimental block consisted of interleaved fully ambiguous probe and exogenously disambiguated prime displays. The purpose of the former was to “read out” the perceptual memory, whereas the latter contained purely random disambiguation sequences that were presented at the beginning of the prime display, throughout the entire presentation, or at the beginning and the end of the presentation. For each experiment and condition, we selected a subset of trials with disambiguation sequences that led to a change in perception of either the prime itself (sequences that modified perception) or the following fully ambiguous probe (sequences that modified perceptual memory). We estimated average disambiguation sequences for each participant using additive linear models. We found that an optimal sequence started at onset with a moderate disambiguation against the previously dominant state (the dominant perception for the previous probe) that gradually reduced until the display was fully ambiguous. We also found that the same sequence led to an altered perception of the prime itself, indicating that perception and perceptual memory form at the same time. We suggest that perceptual memory is a consequence of an earlier evidence-accumulation process and is informative about how the visual system treated ambiguity in the past rather than how it anticipates an uncertain future.
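
For readers unfamiliar with reverse correlation, the sketch below shows its core logic only, not the authors’ analysis code (they used additive linear models): random per-frame disambiguation sequences are averaged separately for trials that did and did not change the subsequent percept, and the difference reveals which frames carry influence. All variable names, array shapes, and data here are illustrative assumptions.

```python
# Minimal sketch of reverse-correlation logic (NOT the authors' code):
# average random disambiguation sequences by perceptual outcome.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_frames = 200, 60

# Random disambiguation strength per frame of each prime display,
# signed relative to the previously dominant perceptual state.
sequences = rng.normal(0.0, 1.0, size=(n_trials, n_frames))

# Hypothetical outcome flags: did the following ambiguous probe flip?
probe_flipped = rng.random(n_trials) < 0.5

# Classification image: mean sequence for flip vs. no-flip trials.
mean_flip = sequences[probe_flipped].mean(axis=0)
mean_stay = sequences[~probe_flipped].mean(axis=0)
kernel = mean_flip - mean_stay   # frames that drive a memory change

print("peak influence at frame", int(np.argmax(np.abs(kernel))))
```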

Recommended citation: Pastukhov, A., Koßmann, L., & Carbon, C.-C. (2023). Reconstructing a disambiguation sequence that forms perceptual memory of multistable displays via reverse correlation method: Bias onset perception but gently. Journal of Vision, 23(3), 10. https://doi.org/10.1167/jov.23.3.10 https://jov.arvojournals.org/article.aspx?articleid=2785454

Perceptions of persons who wear face coverings are modulated by the perceivers’ attitude

Published in Frontiers in Neuroscience, 2022

We examined whether the effect of facial coverings on person perception is influenced by the perceiver’s attitudes. In two online experiments, participants saw the same human target persons repeatedly appearing with and without a specific piece of clothing and had to judge the target persons’ character. In Experiment 1 (N = 101), we investigated how wearing a facial mask influences person perception depending on the perceiver’s attitude toward measures against the COVID-19 pandemic. In Experiment 2 (N = 114), we examined the effect of wearing a head cover associated with Arabic culture on person perception depending on the perceiver’s attitude toward Islam. Both studies were preregistered; both found evidence that person perception is a process shaped not merely by the target person’s outward appearance but also by the personal attitudes of the perceiver. Integrating previous findings, we demonstrate that facial covers, as well as head covers, operate as cues that perceivers use to infer the target persons’ underlying attitudes. The judgment of the target person is shaped by the perceived attitude toward what the facial covering stereotypically symbolizes.

Recommended citation: Leder, J., Koßmann, L., & Carbon, C.-C. (2022). Perceptions of persons who wear face coverings are modulated by the perceivers’ attitude. Frontiers in Neuroscience, 16. https://doi.org/10.3389/fnins.2022.988546 https://www.frontiersin.org/articles/10.3389/fnins.2022.988546/full

Replicating Epley and Gilovich: Need for Cognition, Cognitive Load, and Forewarning do not Moderate Anchoring Effects

Published in PsyArXiv, 2022

Anchoring, the assimilation of numerical estimates toward previously considered numbers, has generally been separated into anchoring from self-generated anchors (e.g., people first thinking of 9 months when asked for the gestation period of an animal) and anchoring from experimenter-provided anchors (e.g., experimenters letting participants spin fortune wheels). For some time, the two types of anchoring were believed to be explained by two different theoretical accounts. However, later research showed crossover between the accounts. What now remains are contradictions between past and recent findings, specifically regarding which moderators affect which type of anchoring. We conducted three replications (total N = 653) of seminal studies on the distinction between self-generated and experimenter-provided anchoring effects, in which we investigated the moderators need for cognition, cognitive load, and forewarning. We found no evidence that either type of anchoring is moderated by any of these moderators. In line with recent replication efforts, we found that anchoring effects were robust, but the findings on moderators of anchoring effects should be treated with caution.

Recommended citation: Röseler, L., Bögler, H. L., Koßmann, L., Krueger, S., Bickenbach, S., Bühler, R., Guardia, J. d., et al. (2022, April 13). Replicating Epley and Gilovich: Need for Cognition, Cognitive Load, and Forewarning do not Moderate Anchoring Effects. PsyArXiv. Retrieved from https://psyarxiv.com/bgp3m/

When perception is stronger than physics: Perceptual similarities rather than laws of physics govern the perception of interacting objects

Published in Attention, Perception, & Psychophysics, 2021

When several multistable displays are viewed simultaneously, their perception is synchronized, as they tend to be in the same perceptual state. Here, we investigated the possibility that perception may reflect embedded statistical knowledge of physical interaction between objects for specific combinations of displays and layouts. We used a novel display with two ambiguously rotating gears and an ambiguous walker-on-a-ball display. Both stimuli produce a physically congruent perception when an interaction is possible (i.e., gears counterrotate, and the ball rolls under the walker’s feet). Next, we gradually manipulated the stimuli to either introduce abrupt changes to the potential physical interaction between objects or keep it constant despite changes in the visual stimulus. We characterized the data using four different models that assumed (1) independence of perception from the stimulus, (2) dependence on the stimulus’s properties, (3) dependence on the physical configuration alone, and (4) an interaction between stimulus properties and the physical configuration. We observed that for the ambiguous gears, perception correlated with the stimulus changes rather than with the possibility of physical interaction. Perception of the walker-on-a-ball display was independent of the stimulus but depended instead on whether participants reported the relative motion of the two objects (perception was biased towards physically congruent motion) or the absolute motion of the walker alone (perception was independent of the rotation of the ball). Neither experiment supported the idea of embedded knowledge of physical interaction.
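
As a schematic of such a four-model comparison, not the authors’ actual analysis, the sketch below fits four logistic models of increasing structure and compares them by AIC; the data frame columns and the simulated data are illustrative assumptions.

```python
# Schematic four-model comparison (NOT the authors' code): logistic
# models of perceptual state with increasing structure, compared by AIC.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "congruent": rng.integers(0, 2, n),   # 1 = physically congruent percept
    "stimulus":  rng.normal(size=n),      # a continuous stimulus property
    "physical":  rng.integers(0, 2, n),   # 1 = physical interaction possible
})

models = {
    "independent": "congruent ~ 1",
    "stimulus":    "congruent ~ stimulus",
    "physical":    "congruent ~ physical",
    "interaction": "congruent ~ stimulus * physical",
}
for name, formula in models.items():
    fit = smf.logit(formula, data=df).fit(disp=False)
    print(f"{name:12s} AIC = {fit.aic:.1f}")  # lower AIC = better trade-off
```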

Recommended citation: Pastukhov, A., Koßmann, L., & Carbon, C.-C. (2022). When perception is stronger than physics: Perceptual similarities rather than laws of physics govern the perception of interacting objects. Attention, Perception, & Psychophysics, 84, 124–137. https://doi.org/10.3758/s13414-021-02383-1 https://link.springer.com/article/10.3758/s13414-021-02383-1