Publication


Featured research published by Birgitta Burger.


Frontiers in Psychology | 2013

Influences of rhythm- and timbre-related musical features on characteristics of music-induced movement.

Birgitta Burger; Marc R. Thompson; Geoff Luck; Suvi Saarikallio; Petri Toiviainen

Music makes us move. Several factors can affect the characteristics of such movements, including individual factors or musical features. For this study, we investigated the effect of rhythm- and timbre-related musical features as well as tempo on movement characteristics. Sixty participants were presented with 30 musical stimuli representing different styles of popular music, and instructed to move along with the music. Optical motion capture was used to record participants’ movements. Subsequently, eight movement features and four rhythm- and timbre-related musical features were computationally extracted from the data, while the tempo was assessed in a perceptual experiment. A subsequent correlational analysis revealed that, for instance, clear pulses seemed to be embodied with the whole body, i.e., by using various movement types of different body parts, whereas spectral flux and percussiveness were found to be more distinctly related to certain body parts, such as head and hand movement. A series of ANOVAs with the stimuli being divided into three groups of five stimuli each based on the tempo revealed no significant differences between the groups, suggesting that the tempo of our stimulus set failed to have an effect on the movement features. In general, the results can be linked to the framework of embodied music cognition, as they show that body movements are used to reflect, imitate, and predict musical characteristics.
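
As a rough, hypothetical illustration of the correlational step described above, the sketch below correlates a set of movement features with rhythm- and timbre-related musical features. The feature names, the random placeholder data, and the use of SciPy are assumptions for illustration, not the study's actual variables or tooling.

```python
import numpy as np
from scipy import stats

# Placeholder data: 30 stimuli, with random stand-ins for the study's
# computationally extracted musical and movement features.
rng = np.random.default_rng(0)
musical = {"pulse_clarity": rng.random(30),
           "spectral_flux": rng.random(30),
           "percussiveness": rng.random(30)}
movement = {"head_speed": rng.random(30),
            "hand_acceleration": rng.random(30)}

# Pairwise Pearson correlations between musical and movement features,
# mirroring the kind of correlational analysis described in the abstract.
for m_name, m_vals in musical.items():
    for b_name, b_vals in movement.items():
        r, p = stats.pearsonr(m_vals, b_vals)
        print(f"{m_name} vs {b_name}: r = {r:.2f}, p = {p:.3f}")
```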


Journal on Multimodal User Interfaces | 2010

Communication of Musical Expression by Means of Mobile Robot Gestures

Birgitta Burger; Roberto Bresin

We developed a robotic system that can behave in an emotional way. A simple three-wheeled robot with limited degrees of freedom was designed. Our goal was to make the robot display emotions in music performance by performing expressive movements. These movements were compiled and programmed based on literature about emotion in music, musicians’ movements in expressive performances, and object shapes that convey different emotional intentions. The emotions happiness, anger, and sadness were implemented in this way. General results from behavioral experiments show that emotional intentions can be synthesized, displayed, and communicated by an artificial creature, even under constrained circumstances.


Psychological Research-Psychologische Forschung | 2018

Synchronization to metrical levels in music depends on low-frequency spectral components and tempo

Birgitta Burger; Justin London; Marc R. Thompson; Petri Toiviainen

Previous studies have found relationships between music-induced movement and musical characteristics at more general levels, such as tempo or pulse clarity. This study focused on the ability to synchronize to music of finely varying tempi and varying degrees of low-frequency spectral change/flux. Excerpts from six classic Motown/R&B songs at three different tempos (105, 115, and 130 BPM) were used as stimuli. Each was then time-stretched by 5% relative to its original tempo, yielding a total of 12 stimuli that were presented to 30 participants. Participants were asked to move along with the stimuli while being recorded with an optical motion capture system. Synchronization analysis was performed relative to the beat and bar levels of the music for four body parts. Results suggest that participants synchronized different body parts to specific metrical levels; in particular, vertical movements of the hip and feet were synchronized to the beat level when the music contained large amounts of low-frequency spectral flux and had a slower tempo, while synchronization of head and hands was more tightly coupled to the weak-flux stimuli at the bar level. Synchronization was generally more tightly coupled to the slower versions of the same stimuli, while synchronization at the bar level showed an inverted U-shaped effect as tempo increased. These results indicate complex relationships between musical characteristics, in particular metrical and temporal structure, and our ability to synchronize and entrain to such musical stimuli.
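
One common way to quantify synchronization to a given metrical level is circular statistics over relative phase; the sketch below computes the mean resultant length of movement-event phases with respect to a beat grid. This is a generic illustration under assumed data structures (event and beat times in seconds), not the analysis pipeline actually used in the paper.

```python
import numpy as np

def sync_strength(event_times, beat_times):
    """Mean resultant length of event phases relative to a beat grid:
    1.0 means perfectly phase-locked, values near 0 mean unsynchronized."""
    beat_times = np.asarray(beat_times)
    phases = []
    for t in event_times:
        i = np.searchsorted(beat_times, t) - 1
        if 0 <= i < len(beat_times) - 1:
            frac = (t - beat_times[i]) / (beat_times[i + 1] - beat_times[i])
            phases.append(2 * np.pi * frac)
    return float(np.abs(np.mean(np.exp(1j * np.array(phases)))))

# Toy example: movement events jittered around a 115 BPM beat grid.
beats = np.arange(0, 30, 60 / 115)
events = beats[:-1] + np.random.default_rng(1).normal(0, 0.02, len(beats) - 1)
print(sync_strength(events, beats))   # close to 1: tight synchronization
print(sync_strength(np.random.default_rng(2).uniform(0, 30, 50), beats))  # much lower
```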


Musicae Scientiae | 2014

Emotion-driven encoding of music preference and personality in dance

Geoff Luck; Suvi Saarikallio; Birgitta Burger; Marc R. Thompson; Petri Toiviainen

Thirty rhythmic music excerpts were presented to 60 individuals. Dance movements to each excerpt were recorded using an optical motion-capture system, preference for each excerpt was recorded on a 5-point Likert scale, and personality was assessed using the 44-item version of the Big Five Inventory. From the movement data, a large number of postural, kinematic, and kinetic features were extracted, and a subset of 11 features was chosen for further analysis using sequential backward elimination with variance inflation factor (VIF) selection. Multivariate analyses revealed significant effects of both preference and personality on these 11 features, as well as a number of interactions between the two. Regarding preference, a U-shaped curvilinear relationship between excerpt preference and amount of movement was identified, hypothesized to relate to the role of emotional arousal in guiding music preference and dance moves. Regarding personality, a different pattern of movement characteristics was associated with each of the Big Five dimensions, broadly supporting previous work.
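
The sketch below shows one plain implementation of VIF-based sequential backward elimination, the feature-selection step named in the abstract. The threshold, the pandas/NumPy tooling, and the stand-in data are illustrative assumptions, not the study's settings.

```python
import numpy as np
import pandas as pd

def vif(X: pd.DataFrame) -> pd.Series:
    """Variance inflation factor per column: regress each feature on the
    remaining ones and compute 1 / (1 - R^2)."""
    out = {}
    for col in X.columns:
        y = X[col].to_numpy()
        others = X.drop(columns=col).to_numpy()
        A = np.column_stack([np.ones(len(others)), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1 - (y - A @ beta).var() / y.var()
        out[col] = 1 / max(1 - r2, 1e-12)
    return pd.Series(out)

def backward_eliminate(X: pd.DataFrame, threshold: float = 5.0) -> pd.DataFrame:
    """Repeatedly drop the feature with the largest VIF until all VIFs
    fall below the (assumed) threshold."""
    X = X.copy()
    while len(X.columns) > 1:
        v = vif(X)
        if v.max() < threshold:
            break
        X = X.drop(columns=v.idxmax())
    return X

# Toy usage with random stand-in movement features:
X = pd.DataFrame(np.random.default_rng(0).random((60, 8)),
                 columns=[f"feat_{i}" for i in range(8)])
print(backward_eliminate(X).columns.tolist())
```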


Journal of New Music Research | 2018

EDM and Ecstasy: the lived experiences of electronic dance music festival attendees

Noah Little; Birgitta Burger; Stephen M. Croucher

Attendance at large-scale music festivals has captivated global interest in these spectacular experiences, yet little is known about the lasting benefits and personal changes individuals incur following such an event. This study aims to provide a comprehensive exploration of the lived experiences of individuals who attended a multi-day electronic dance music festival. The present study was primarily interested in the perceived beneficial changes within the individual following their festival experience. We investigated whether first-time festival attendees' perceived changes differed from those of returning individuals. Semi-structured qualitative interviews were used to collect data from 12 individuals who attended the 2015 Electric Daisy Carnival in Las Vegas. Six participants were first-time attendees while the remaining six were returning to the festival. The data were analysed using thematic analysis. Three central themes emerged from the data: (1) escape, (2) communitas, and (3) self-reported changes, along with 10 subthemes. These findings add to the existing body of music festival literature, further contextualizing how music festivals are both experienced and reflected upon by individuals. Further, this study highlights the potential lasting changes individuals experience from attending electronic dance music festivals.


Musicae Scientiae | 2018

Embodiment in Electronic Dance Music: Effects of musical content and structure on body movement

Birgitta Burger; Petri Toiviainen

Electronic dance music (EDM) is music produced with the foremost aim of making people move. While research has revealed relationships between movement features and, for example, musical, emotional, or personality characteristics, systematic investigations of genre differences, and specifically of EDM, are rather rare. This article aims to offer insights into the embodiment of EDM from three different angles: first from a genre-comparison perspective, then by comparing different EDM stimuli with each other, and finally by investigating embodiment of one specific EDM stimulus. Sixty participants moved freely to 16 stimuli from four different genres (EDM, Latin, Funk, Jazz; four stimuli per genre) while being recorded with an optical motion capture system. Subsequently, a set of movement features was extracted from the motion capture data. Results indicate that participants moved with significantly higher acceleration of the torso, head, hands, and feet, and with more overall movement, to the EDM stimuli than to the other genres. Between EDM stimuli, several significant correlations were found, suggesting an increase in acceleration of different body parts with clearer and more percussive rhythmic structures and brighter sounds. Within one EDM stimulus, participants’ movements differed in several movement features, distinguishing the break from the surrounding sections and showing less acceleration, as well as less overall movement and rotational speed, during the break. These analyses propose different ways of studying EDM and indicate distinctive characteristics of EDM embodiment.
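
Acceleration-based movement features of the kind reported above can be derived from marker trajectories by numerical differentiation. The sketch below is a minimal stand-alone version using NumPy, with a synthetic trajectory and an assumed 120 fps capture rate; it is not the feature-extraction code used in the study.

```python
import numpy as np

def mean_acceleration(position: np.ndarray, fps: float) -> float:
    """Mean acceleration magnitude for one marker.
    position: (n_frames, 3) array of x, y, z coordinates in metres."""
    dt = 1.0 / fps
    velocity = np.gradient(position, dt, axis=0)       # first time derivative
    acceleration = np.gradient(velocity, dt, axis=0)   # second time derivative
    return float(np.linalg.norm(acceleration, axis=1).mean())

# Synthetic trajectory: a marker oscillating vertically at 2 Hz, captured at 120 fps.
t = np.arange(0, 10, 1 / 120)
pos = np.column_stack([np.zeros_like(t), np.zeros_like(t),
                       0.05 * np.sin(2 * np.pi * 2 * t)])
print(mean_acceleration(pos, fps=120))
```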


Psychomusicology: Music, Mind and Brain | 2017

Personality and musical preference using social-tagging in excerpt-selection.

Emily Carlson; Pasi Saari; Birgitta Burger; Petri Toiviainen

Music preference has been related to individual differences like social identity, cognitive style, and personality, but quantifying music preference can be a challenge. Self-report measures may be too presumptive of shared genre definitions between listeners, while listener ratings of expert-selected music may fail to reflect typical listeners’ genre boundaries. The current study aims to address this by using a social-tagging approach to select music for studying preference. In this study, 2,407 tracks were collected and subsampled from the Last.fm social-tagging service and the EchoNest platform based on attributes such as genre, tempo, and danceability. The set was further subsampled according to tempo estimates and metadata from EchoNest, resulting in 48 excerpts from 12 genres. Participants (n = 210) heard and rated the excerpts, rated each genre using the Short Test of Music Preferences (STOMP), and completed the Ten-Item Personality Inventory (TIPI), the Empathy Quotient (EQ), and the Systemizing Quotient (SQ). Mean preference ratings correlated significantly with STOMP scores, suggesting that social tagging can provide a fairly reliable link between perception and genre labels. Principal component analysis (PCA) of the ratings revealed four musical components: “Danceable,” “Jazzy,” “Hard,” and “Rebellious.” Component scores correlated modestly but significantly with TIPI, EQ, and SQ scores. These results support and expand previous findings linking personality and music preference, and provide support for a novel method of using crowd tagging in the study of music preference.
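
For readers unfamiliar with the dimension-reduction step, the sketch below runs a four-component PCA over a participants-by-excerpts rating matrix matching the dimensions described (210 x 48). The random data and the scikit-learn tooling are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder ratings: 210 participants x 48 excerpts (random stand-in data).
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(210, 48)).astype(float)

# Standardize the excerpt ratings, then extract four components,
# analogous to the four preference components reported in the abstract.
z = StandardScaler().fit_transform(ratings)
pca = PCA(n_components=4)
scores = pca.fit_transform(z)      # per-participant component scores
loadings = pca.components_         # excerpt loadings on each component
print(pca.explained_variance_ratio_)
```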


Psychomusicology: Music, Mind and Brain | 2017

Anxiety reduction with music and tempo synchronization on magnetic resonance imaging patients.

Zsuzsa Földes; Esa Ala-Ruona; Birgitta Burger; Gergely Orsi

Anxiety and claustrophobic reactions during MRI examinations cause unintentional movements, and such motion artifacts lead to interpretation problems. Furthermore, anesthesia, when requested, makes the process costly. A total of 60 outpatients were examined at the Diagnostic Centre of Pécs, Hungary, to test whether synchronizing recorded music to the gradient pulsation of the MRI device can improve the sedative effect of the music. The patients were assigned to three groups: a non-music (control), an original-tempo (random), and a synchronized-music (synchronous) group. Results showed a significantly decreased state anxiety level after the MRI examination in the random and synchronous groups as compared with the control group. However, there was no difference between the two music conditions in their effect on state anxiety level after the examination. Participants in the music groups found the examination significantly more pleasant compared with the control group. In conclusion, the present study provides support for the notion that listening to music during an MRI examination significantly reduces patient anxiety, whereas noise-attenuating devices do not provide the same effect.
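
The core manipulation, matching a recording's tempo to a target pulse rate, can be sketched as a time-stretching operation. The snippet below uses librosa and soundfile as assumed tools, with a hypothetical file path and target BPM; it is not the procedure actually used in the study.

```python
import numpy as np
import librosa
import soundfile as sf

def stretch_to_bpm(in_path: str, out_path: str, target_bpm: float) -> None:
    """Time-stretch a recording so its estimated tempo matches target_bpm
    (e.g. the repetition rate of an MRI gradient pulse sequence)."""
    y, sr = librosa.load(in_path, sr=None)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)       # rough global tempo estimate
    rate = target_bpm / float(np.atleast_1d(tempo)[0])   # >1 speeds up, <1 slows down
    y_sync = librosa.effects.time_stretch(y, rate=rate)
    sf.write(out_path, y_sync, sr)

# Hypothetical usage: align a track to an assumed 120 BPM gradient pulsation.
# stretch_to_bpm("track.wav", "track_sync.wav", target_bpm=120.0)
```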


10th Sound and Music Computing Conference, SMC 2013, Stockholm, Sweden | 2013

MoCap Toolbox - A MATLAB toolbox for computational analysis of movement data

Birgitta Burger; Petri Toiviainen


Journal of Research in Personality | 2010

Effects of the Big Five and musical genre on music-induced movement

Geoff Luck; Suvi Saarikallio; Birgitta Burger; Marc R. Thompson; Petri Toiviainen

Collaboration


Dive into Birgitta Burger's collaborations.

Top Co-Authors

Geoff Luck, University of Jyväskylä

Tommi Jantunen, University of Jyväskylä

Emily Carlson, University of Jyväskylä