Luca Turchet
Queen Mary University of London
Publications
Featured research published by Luca Turchet.
tangible and embedded interaction | 2018
Sophie Skach; Anna Xambó; Luca Turchet; Ariane Stolfi; Rebecca Stewart; Mathieu Barthet
This paper presents initial steps towards the design of an embedded system for body-centric sonic performance. The proposed prototyping system allows performers to manipulate sounds through gestural interactions captured by textile wearable sensors. The e-textile sensor data control, in real time, audio synthesis algorithms that work with content from Audio Commons, a novel web-based ecosystem for repurposing crowd-sourced audio. The system enables creative embodied music interactions by combining seamless physical e-textiles with web-based digital audio technologies.
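A minimal sketch, in Python rather than the embedded implementation described above, of the mapping idea behind such a system: normalized e-textile sensor readings drive the parameters of a simple synthesizer. The sensor names and mapping ranges are illustrative assumptions, not the authors' design.

```python
# Hypothetical sketch: mapping normalized e-textile sensor readings to
# synthesis parameters. Sensor names and ranges are assumptions.
import numpy as np

SAMPLE_RATE = 44100

def render_block(stretch_sensor, pressure_sensor, duration=0.1):
    """Render a short audio block whose pitch and amplitude follow
    two wearable sensor values normalized to [0, 1]."""
    frequency = 220.0 + stretch_sensor * 660.0   # stretch controls pitch
    amplitude = 0.1 + pressure_sensor * 0.8      # pressure controls loudness
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * frequency * t)

# Example: a firm press while the textile is half stretched.
block = render_block(stretch_sensor=0.5, pressure_sensor=0.9)
print(block.shape, float(block.max()))
```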
nordic conference on human-computer interaction | 2018
Chiara Rossitto; Asreen Rostami; Jakob Tholander; Donald McMillan; Louise Barkhuus; Carlo Fischione; Luca Turchet
This paper presents a case study of a fully working prototype of the Sensus smart guitar. Eleven professional guitar players were interviewed after a prototype test session. The smartness of the guitar was perceived as enabling the integration of a range of equipment into a single device, and the proactive exploration of novel expressions. The results draw attention to the musicians' sense-making of the smart qualities, and to the perceived impact on their artistic practices. The themes highlight how smartness was experienced in relation to the guitar's agency and the skills it requires, to the tension between explicit (e.g. playing a string) and implicit (e.g. keeping rhythm) body movements, and to performing and producing music. Understanding this felt sense of smartness is relevant to how contemporary HCI research conceptualizes mundane artefacts enhanced with smart technologies, and to how such discourse can inform related design issues.
Frontiers in ICT | 2018
Luca Turchet; Andrew McPherson; Mathieu Barthet
Smart musical instruments are a class of IoT devices for music making, which encompass embedded intelligence as well as wireless connectivity. In previous work, we established design requirements for a novel smart musical instrument, a smart cajon, following a user-centred approach. This paper describes the implementation and technical evaluation of the components of the smart cajon designed for hit classification and repurposing. A conventional acoustic cajon was enhanced with sensors to classify the position of each hit and the gesture that produced it. The instrument was equipped with five piezo pickups attached to the internal panels and a condenser microphone located inside. The developed sound engine leveraged digital signal processing, sensor fusion, and machine learning techniques to classify the position, dynamics, and timbre of each hit. The techniques were devised and implemented to achieve low latency between action and the electronically-generated sounds, as well as to keep computational efficiency high. The system was tuned to classify two main cajon playing techniques at different locations, and we conducted evaluations using over 2000 hits performed by two professional players. We first assessed the classification performance when training and testing data were drawn from recordings of the same player. In this configuration, classification accuracies of 100% were obtained for hit detection and location. Accuracies of over 90% were obtained when classifying the timbres produced by the two playing techniques. We then assessed the classifier in a cross-player configuration (training and testing were performed using recordings from different players). Results indicated that while hit-location classification transfers relatively well across players, gesture identification requires classifiers trained specifically for each musician.
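An illustrative sketch of the kind of per-player hit-classification pipeline described above: a hit is detected by an energy threshold across the piezo channels, simple per-channel features are extracted, and a classifier is trained for a single musician. The feature choices, threshold, and SVM classifier are assumptions for illustration, not the paper's exact method.

```python
# Hypothetical per-player hit classification sketch (numpy + scikit-learn).
import numpy as np
from sklearn.svm import SVC

def detect_hit(frame, threshold=0.05):
    """True if any piezo channel in this frame exceeds the energy threshold."""
    return np.max(np.abs(frame)) > threshold

def extract_features(frame):
    """Per-channel peak and RMS; a stand-in for the sensor-fusion features."""
    peak = np.max(np.abs(frame), axis=0)
    rms = np.sqrt(np.mean(frame ** 2, axis=0))
    return np.concatenate([peak, rms])

def train_per_player(frames, labels):
    """frames: list of (samples x 5 piezo channels) arrays, one per detected hit.
    labels: hit location / playing technique annotated for one player."""
    X = np.vstack([extract_features(f) for f in frames])
    clf = SVC(kernel="rbf")
    clf.fit(X, labels)
    return clf
```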
audio mostly conference | 2017
Luca Turchet
This paper presents the Hyper-Mandolin, a conventional acoustic mandolin augmented with different types of sensors and a microphone, enabling real-time control of digital effects and sound generators during the performer's act of playing. The added technology is conveniently placed and does not hinder the acoustic use of the instrument. A modular architecture connects various sensor interfaces to a central computing unit dedicated to the analog-to-digital conversion of the sensor data. Such an architecture allows for an easy interchange of the sensor interface layouts. The processing of audio and sensor data is accomplished by applications coded in Max/MSP and running on an external computer. The instrument can also be used as a controller for digital audio workstations. The interactive control of the sonic output is based on the extraction of features from both the data captured by the sensors and the acoustic waveforms captured by the microphone. The development of this instrument was mainly motivated by the author's need to extend the sonic and interaction possibilities of the acoustic mandolin when used in conjunction with conventional electronics for sound processing.
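A minimal sketch, in Python rather than the Max/MSP applications used by the author, of the feature-to-parameter mapping idea: an audio feature (here a spectral centroid) extracted from the microphone signal is blended with a sensor value and sent to an external patch over OSC. The OSC address, port, and mapping are hypothetical.

```python
# Hypothetical mapping sketch: audio feature + sensor value -> effect parameter.
import numpy as np
from pythonosc.udp_client import SimpleUDPClient

SAMPLE_RATE = 44100
client = SimpleUDPClient("127.0.0.1", 9000)  # e.g. a patch listening on UDP

def spectral_centroid(audio_block):
    """Centroid of the magnitude spectrum, in Hz."""
    magnitude = np.abs(np.fft.rfft(audio_block))
    freqs = np.fft.rfftfreq(len(audio_block), d=1.0 / SAMPLE_RATE)
    return float(np.sum(freqs * magnitude) / (np.sum(magnitude) + 1e-12))

def update_effect(audio_block, sensor_value):
    """Map the brightness of the acoustic signal and a sensor reading to [0, 1]."""
    brightness = min(spectral_centroid(audio_block) / 5000.0, 1.0)
    amount = 0.5 * brightness + 0.5 * sensor_value
    client.send_message("/effects/delay/feedback", amount)  # hypothetical address
```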
audio mostly conference | 2017
Luca Turchet; Michele Benincaso; Carlo Fischione
This paper presents some of the possibilities for interaction between performers, audiences, and their smart devices, offered by a novel family of musical instruments, the Smart Instruments. For this purpose, several implemented use cases are described, which involved a preliminary prototype of MIND Music Labs' Sensus Smart Guitar, the first exemplar of a Smart Instrument. Sensus consists of a guitar augmented with sensors, actuators, onboard processing, and wireless communication. Some of the novel interactions enabled by the Sensus technology are presented, which are based on the connectivity of the instrument to smart devices, virtual reality headsets, and the cloud.
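A hypothetical sketch of an instrument-to-smart-device link of the kind such use cases rely on: the instrument broadcasts a snapshot of its augmented controls as JSON over UDP to a companion app. The message format, port, and field names are assumptions made for illustration only.

```python
# Hypothetical instrument-state broadcast over UDP (standard library only).
import json
import socket

COMPANION_APP = ("192.168.1.50", 8000)  # hypothetical smart-device address
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def publish_state(neck_pressure, body_tilt, preset_id):
    """Send one snapshot of the instrument's augmented controls."""
    message = {
        "neck_pressure": neck_pressure,  # 0..1, from a pressure sensor
        "body_tilt": body_tilt,          # degrees, from an inertial sensor
        "preset": preset_id,             # currently selected sound preset
    }
    sock.sendto(json.dumps(message).encode("utf-8"), COMPANION_APP)

publish_state(neck_pressure=0.3, body_tilt=12.5, preset_id="clean-reverb")
```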
Journal of The Audio Engineering Society | 2018
Luca Turchet; Andrew McPherson; Mathieu Barthet
IEEE Access | 2018
Luca Turchet; Carlo Fischione; Georg Essl; Damian Keller; Mathieu Barthet
IEEE Access | 2018
Luca Turchet
Applied Acoustics | 2018
Luca Turchet; Ivan Camponogara; Francesca Nardello; Paola Zamparo; Paola Cesari
Archive | 2016
Michele Benincaso; Agostino De Angelis; Carlo Fischione; Luca Turchet; Stefano Zambon