Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Bruno Lepri is active.

Publication


Featured research published by Bruno Lepri.


International Conference on Multimodal Interfaces | 2008

Multimodal recognition of personality traits in social interactions

Fabio Pianesi; Nadia Mana; Alessandro Cappelletti; Bruno Lepri; Massimo Zancanaro

This paper targets the automatic detection of personality traits in a meeting environment by means of audio and visual features; information about the relational context is captured through acoustic features designed for that purpose. Two personality traits are considered: Extraversion (from the Big Five) and the Locus of Control. The classification task is applied to thin slices of behaviour, in the form of 1-minute sequences. SVMs were used to test the performance of several training and testing instance setups, including a restricted set of audio features obtained through feature selection. The outcomes improve considerably over existing results, provide evidence for the feasibility of multimodal personality analysis and the role of social context, and pave the way to further studies addressing different feature setups and/or targeting different personality traits.
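The general recipe described in this abstract can be sketched as follows. This is a hedged illustration, not the authors' code: the synthetic features, the feature count, and the use of scikit-learn are all assumptions standing in for the paper's actual audio/visual features and SVM setup.

```python
# Sketch: classify 1-minute "thin slices" of behaviour with an SVM,
# restricting to a smaller feature subset via univariate feature selection.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))    # 200 one-minute slices, 40 stand-in audio/visual features
y = rng.integers(0, 2, size=200)  # binary label, e.g. high vs. low Extraversion

clf = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=10),  # restricted feature set via selection
    SVC(kernel="rbf"),
)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

With real labeled slices in place of the random data, the cross-validated score estimates how well the chosen feature subset separates the trait classes.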


International Conference on Multimodal Interfaces | 2006

Automatic detection of group functional roles in face to face interactions

Massimo Zancanaro; Bruno Lepri; Fabio Pianesi

In this paper, we discuss a machine learning approach to automatically detect the functional roles played by participants in a face-to-face interaction. We briefly introduce the coding scheme we used to classify the roles of group members and the corpus we collected to assess the coding scheme's reliability and to train statistical systems for automatic role recognition. We then discuss a machine learning approach based on multi-class SVMs that detects such roles from simple features of the visual and acoustic scene. The classification outperforms the chosen baselines, and although the results are not yet good enough for a real application, they demonstrate the feasibility of detecting group functional roles in face-to-face interactions.


International Conference on Multimodal Interfaces | 2007

Using the influence model to recognize functional roles in meetings

Wen Dong; Bruno Lepri; Alessandro Cappelletti; Alex Pentland; Fabio Pianesi; Massimo Zancanaro

In this paper, an influence model is used to recognize the functional roles played during meetings. Previous work on the same corpus demonstrated high recognition accuracy using SVMs with RBF kernels. Here we discuss the problems of that approach, mainly over-fitting, the curse of dimensionality, and the inability to generalize to different group configurations. We present results obtained with an influence modeling method that avoids these problems and ensures both greater robustness and better generalization capability.


Mobile and Ubiquitous Multimedia | 2011

Modeling the co-evolution of behaviors and social relationships using mobile phone data

Wen Dong; Bruno Lepri; Alex Pentland

The co-evolution of social relationships and individual behavior in time and space has important implications, but is poorly understood because of the difficulty of closely tracking the everyday life of a complete community. We offer evidence that relationships and behavior co-evolve in a student dormitory, based on monthly surveys and location tracking through residents' cellular phones over a period of nine months. We demonstrate that a Markov jump process can capture the co-evolution in terms of the rates at which residents visit places and friends.
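A Markov jump process of the kind mentioned here moves between discrete states after exponentially distributed holding times. The sketch below simulates one, with invented states and rates; the paper instead estimates such rates from phone-based location traces, so everything concrete in this snippet is an assumption for illustration.

```python
# Sketch: Gillespie-style simulation of a Markov jump process over the
# "places" a resident visits. Rates are per day and purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
states = ["dorm", "class", "friend"]
# Q[i, j]: rate of jumping from state i to state j (off-diagonal entries)
Q = np.array([[0.0, 1.5, 0.5],
              [2.0, 0.0, 0.3],
              [1.0, 0.8, 0.0]])

def simulate(q, start=0, t_max=30.0):
    """Draw an exponential holding time, then jump to a state with
    probability proportional to its outgoing rate."""
    t, s, path = 0.0, start, []
    while True:
        total = q[s].sum()                 # total rate of leaving state s
        t += rng.exponential(1.0 / total)  # exponential holding time
        if t > t_max:
            return path
        s = rng.choice(len(q), p=q[s] / total)
        path.append((t, s))

visits = simulate(Q)
print(len(visits))
```

In the estimation direction, the rate Q[i, j] would be recovered from data as the number of observed i-to-j transitions divided by the total time spent in state i.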


ACM Multimedia | 2014

Daily Stress Recognition from Mobile Phone Data, Weather Conditions and Individual Traits

Andrey Bogomolov; Bruno Lepri; Michela Ferron; Fabio Pianesi; Alex Pentland

Research has shown that stress reduces quality of life and causes many diseases. For this reason, several researchers have devised stress detection systems based on physiological parameters. However, these systems require obtrusive sensors to be carried by the user continuously. In this paper, we propose an alternative approach, providing evidence that daily stress can be reliably recognized from behavioral metrics derived from the user's mobile phone activity and from additional indicators, such as weather conditions (data pertaining to transitory properties of the environment) and personality traits (data concerning permanent dispositions of individuals). Our multifactorial statistical model, which is person-independent, achieves an accuracy of 72.28% on a 2-class daily stress recognition problem. The model is efficient to implement in most multimedia applications thanks to its highly reduced, low-dimensional feature space (32 dimensions). Moreover, we identify and discuss the indicators with the strongest predictive power.
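The multifactorial setup described above can be sketched as a single classifier over concatenated feature groups. Everything below is a hedged stand-in: the feature names, the split of the 32 dimensions, the random data, and the choice of a random forest are assumptions, not the paper's model.

```python
# Sketch: person-independent 2-class daily stress recognition over a
# 32-dimensional feature vector combining phone, weather, and trait data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_days = 300
phone = rng.normal(size=(n_days, 20))        # stand-in call/SMS/proximity metrics
weather = rng.normal(size=(n_days, 7))       # transitory environmental indicators
personality = rng.normal(size=(n_days, 5))   # stable individual dispositions
X = np.hstack([phone, weather, personality]) # 32-dimensional feature space
y = rng.integers(0, 2, size=n_days)          # stressed vs. not stressed

clf = RandomForestClassifier(n_estimators=100, random_state=0)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(acc)
```

With real data, feature importances from such a model are one way to identify which indicators carry the most predictive power.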


Ubiquitous Computing | 2008

Multimodal support to group dynamics

Fabio Pianesi; Massimo Zancanaro; Elena Not; Chiara Leonardi; Vera Falcon; Bruno Lepri

The complexity of group dynamics occurring in small-group interactions often hinders team performance. The availability of rich multimodal information about what is going on during a meeting makes it possible to explore ways of supporting dysfunctional teams, from facilitation to training sessions addressing both the individuals and the group as a whole. A necessary step in this direction is capturing and understanding group dynamics. In this paper, we discuss a particular scenario in which meeting participants receive multimedia feedback on their relational behaviour, as a first step towards increasing self-awareness. We describe the background and motivation for a coding scheme for annotating meeting recordings, partially inspired by Bales' Interaction Process Analysis, aimed at identifying suitable observable behavioural sequences. The study is complemented by an experimental investigation of the acceptability of such a service.


International Conference on Multimodal Interfaces | 2011

Please, tell me about yourself: automatic personality assessment using short self-presentations

Ligia Maria Batrinca; Nadia Mana; Bruno Lepri; Fabio Pianesi; Nicu Sebe

Personality plays an important role in the way people manage the images they convey in self-presentations and employment interviews, trying to shape others' first impressions and increase their effectiveness. This paper addresses the automatic detection of the Big Five personality traits from short (30-120 second) self-presentations, investigating the effectiveness of 29 simple acoustic and visual non-verbal features. Our results show that Conscientiousness and Emotional Stability/Neuroticism are the most recognizable traits. The lower accuracy levels for Extraversion and Agreeableness are explained by the interaction between situational characteristics and the differential activation of the behavioral dispositions underlying those traits.


Language Resources and Evaluation | 2007

A multimodal annotated corpus of consensus decision making meetings

Fabio Pianesi; Massimo Zancanaro; Bruno Lepri; Alessandro Cappelletti

In this paper we present an annotated audio-video corpus of multi-party meetings. For each subject involved in the experimental sessions, the multimodal corpus provides six annotation dimensions referring to group dynamics, speech activity, and body activity. The corpus is based on 11 audio- and video-recorded sessions which took place in a lab setting appropriately equipped with cameras and microphones. Our main concern in collecting this multimodal corpus was to explore the possibility of providing feedback services that facilitate group processes and enhance self-awareness among small groups engaged in meetings. We therefore introduce a coding scheme for annotating the relevant functional roles that appear in small-group interaction. We also discuss the reliability of the coding scheme and present first results for automatic classification.


ACM Multimedia | 2014

Automatic Personality and Interaction Style Recognition from Facebook Profile Pictures

Fabio Celli; Elia Bruni; Bruno Lepri

In this paper, we address personality and interaction style recognition from Facebook profile pictures. We recruited volunteers among Facebook users and collected a dataset of profile pictures, labeled with gold-standard self-assessed personality and interaction style labels. We then exploited a bag-of-visual-words technique to extract features from the pictures. Finally, different machine learning approaches were used to test the effectiveness of these features in predicting personality and interaction style traits. Our results show that this task is very promising, because profile pictures convey a great deal of information about a user and are directly connected to impression formation and identity management.
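The bag-of-visual-words technique named in this abstract quantizes local image descriptors against a learned codebook and represents each image as a histogram of codeword counts. The sketch below shows that general pipeline, not the paper's implementation: the random vectors stand in for real SIFT-like descriptors, and the codebook size is an arbitrary assumption.

```python
# Sketch: bag-of-visual-words features via a k-means codebook.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
K = 16  # codebook size (illustrative)

# Stand-in local descriptors pooled from many training images
train_descriptors = rng.normal(size=(500, 64))
codebook = KMeans(n_clusters=K, n_init=10, random_state=0).fit(train_descriptors)

def bovw_histogram(descriptors):
    """Map one image's local descriptors to a normalized codeword histogram."""
    words = codebook.predict(descriptors)
    hist = np.bincount(words, minlength=K).astype(float)
    return hist / hist.sum()

profile_picture = rng.normal(size=(120, 64))  # descriptors from one picture
features = bovw_histogram(profile_picture)
print(features.shape)  # (16,)
```

The resulting fixed-length histograms can then be fed to any standard classifier to predict trait labels.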


IEEE Transactions on Affective Computing | 2012

Connecting Meeting Behavior with Extraversion—A Systematic Study

Bruno Lepri; Ramanathan Subramanian; Kyriaki Kalimeri; Jacopo Staiano; Fabio Pianesi; Nicu Sebe

This work investigates the suitability of medium-grained meeting behaviors, namely, speaking time and social attention, for automatic classification of the Extraversion personality trait. Experimental results confirm that these behaviors are indeed effective for the automatic detection of Extraversion. The main findings of our study are that: 1) Speaking time and (some forms of) social gaze are effective indicators of Extraversion, 2) classification accuracy is affected by the amount of time for which meeting behavior is observed, 3) independently considering only the attention received by the target from peers is insufficient, and 4) distribution of social attention of peers plays a crucial role.

Collaboration


Dive into Bruno Lepri's collaborations.

Top Co-Authors

Alex Pentland

Massachusetts Institute of Technology

Nadia Mana

Fondazione Bruno Kessler
