Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Asad Malik is active.

Publication


Featured research published by Asad Malik.


Neuroscience Letters | 2014

Neural correlates of emotional responses to music: An EEG study

Ian Daly; Asad Malik; Faustina Hwang; Etienne B. Roesch; James Weaver; Alexis Kirke; Duncan Williams; Eduardo Reck Miranda; Slawomir J. Nasuto

This paper presents an EEG study into the neural correlates of music-induced emotions. We presented participants with a large dataset containing musical pieces in different styles, and asked them to report on their induced emotional responses. We found neural correlates of music-induced emotion in a number of frequencies over the pre-frontal cortex. Additionally, we found a set of patterns of functional connectivity, defined by inter-channel coherence measures, to be significantly different between groups of music-induced emotional responses.
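
The connectivity measure named here, inter-channel coherence, can be illustrated with a short Python sketch: magnitude-squared coherence between two channels, averaged over an alpha band. The sampling rate, synthetic signals, and band edges below are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy.signal import coherence

fs = 256                                 # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
ch_a = rng.standard_normal(fs * 30)      # 30 s of synthetic "EEG"
ch_b = ch_a + 0.5 * rng.standard_normal(fs * 30)  # correlated second channel

# Magnitude-squared coherence per frequency bin (Welch averaging).
f, cxy = coherence(ch_a, ch_b, fs=fs, nperseg=fs * 2)

# Average coherence within an assumed alpha band (8-13 Hz) as one
# connectivity feature for this channel pair.
alpha = (f >= 8) & (f <= 13)
print(f"alpha-band coherence: {cxy[alpha].mean():.3f}")
```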


Brain and Cognition | 2015

Music-induced emotions can be predicted from a combination of brain activity and acoustic features.

Ian Daly; Duncan Williams; James Hallowell; Faustina Hwang; Alexis Kirke; Asad Malik; James Weaver; Eduardo Reck Miranda; Slawomir J. Nasuto

It is widely acknowledged that music can communicate and induce a wide range of emotions in the listener. However, music is a highly complex audio signal composed of a wide range of complex time- and frequency-varying components. Additionally, music-induced emotions are known to differ greatly between listeners. Therefore, it is not immediately clear what emotions will be induced in a given individual by a piece of music. We attempt to predict the music-induced emotional response in a listener by measuring the activity in the listener's electroencephalogram (EEG). We combine these measures with acoustic descriptors of the music, an approach that allows us to consider music as a complex set of time-varying acoustic features, independently of any specific music theory. Regression models are found which allow us to predict the music-induced emotions of our participants with a correlation between the actual and predicted responses of up to r = 0.234, p < 0.001. This regression fit suggests that over 20% of the variance of the participants' music-induced emotions can be predicted by their neural activity and the properties of the music. Given the large amount of noise, non-stationarity, and non-linearity in both EEG and music, this is an encouraging result. Additionally, the combination of measures of brain activity and acoustic features describing the music played to our participants allows us to predict music-induced emotions with significantly higher accuracies than either feature type alone (p < 0.01).
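
A minimal sketch of the prediction setup described above, on synthetic data throughout: EEG-derived and acoustic features are concatenated, a linear regression is fitted, and the fit is scored by the Pearson correlation between actual and predicted ratings. Feature counts and names are invented for illustration; the paper does not specify its regression model in this abstract.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials = 200
eeg_feats = rng.standard_normal((n_trials, 12))      # e.g. band powers
acoustic_feats = rng.standard_normal((n_trials, 8))  # e.g. tempo, loudness
X = np.hstack([eeg_feats, acoustic_feats])
y = X @ rng.standard_normal(20) + rng.standard_normal(n_trials)  # toy ratings

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)

# Score by the correlation between actual and predicted responses,
# the same statistic the paper reports (up to r = 0.234 there).
r, p = pearsonr(y_te, model.predict(X_te))
print(f"r = {r:.3f}, p = {p:.3g}")
```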


Journal of Neural Engineering | 2016

Affective brain–computer music interfacing

Ian Daly; Duncan Williams; Alexis Kirke; James Weaver; Asad Malik; Faustina Hwang; Eduardo Reck Miranda; Slawomir J. Nasuto

OBJECTIVE We aim to develop and evaluate an affective brain-computer music interface (aBCMI) for modulating the affective states of its users. APPROACH An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. MAIN RESULTS The final online aBCMI is able to detect its users' current affective states with classification accuracies of up to 65% (3 class, [Formula: see text]) and modulate its users' affective states significantly above chance level [Formula: see text]. SIGNIFICANCE Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to users' affective states. Possible applications include use in music therapy and entertainment.
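
The closed loop described above (detect affective state, pick an affective target, generate matching music) might be skeletonised as follows. Every function, value, and label here is a hypothetical placeholder for illustration, not the authors' system.

```python
import random

# Hypothetical (valence, arousal) targets for two therapeutic goals.
TARGETS = {"calmer": (0.6, -0.5), "happier": (0.8, 0.4)}

def classify_affect(eeg_features):
    # Placeholder decoder: the real system used an EEG classifier
    # reaching up to 65% accuracy over three classes.
    return random.choice(["low", "neutral", "high"])

def next_music_target(current_state, goal):
    # Placeholder for the case-based reasoning step: map the detected
    # state and the goal to a (valence, arousal) target that the
    # algorithmic composition system should realise.
    return TARGETS[goal]

state = classify_affect(eeg_features=None)
print("detected:", state, "-> target:", next_music_target(state, "calmer"))
```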


International Conference of the IEEE Engineering in Medicine and Biology Society | 2014

Changes in music tempo entrain movement related brain activity.

Ian Daly; James Hallowell; Faustina Hwang; Alexis Kirke; Asad Malik; Etienne B. Roesch; James Weaver; Duncan Williams; Eduardo Reck Miranda; Slawomir J. Nasuto

The neural mechanisms of music listening and appreciation are not yet completely understood. Based on the apparent relationship between the beats per minute (tempo) of music and the desire to move (for example, foot tapping) induced while listening to that music, it is hypothesised that musical tempo may evoke movement-related activity in the brain. Participants are instructed to listen, without moving, to a large range of musical pieces spanning a range of styles and tempos during an electroencephalogram (EEG) experiment. Event-related desynchronisation (ERD) in the EEG is observed to correlate significantly with the variance of the tempo of the musical stimuli. This suggests that the dynamics of the beat of the music may induce movement-related brain activity in the motor cortex. Furthermore, significant correlations are observed between EEG activity in the alpha band over the motor cortex and the bandpower of the music in the same frequency band over time. This relationship is observed to correlate with the strength of the ERD, suggesting that entrainment of motor cortical activity relates to increased ERD strength.
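
Event-related desynchronisation, the key measure here, is a relative drop in band power against a reference period. A minimal sketch, with an assumed sampling rate, window lengths, and alpha band (8-13 Hz); the signals are synthetic stand-ins.

```python
import numpy as np
from scipy.signal import welch

fs = 256
rng = np.random.default_rng(0)
baseline = rng.standard_normal(fs * 2)         # 2 s reference window
listening = 0.7 * rng.standard_normal(fs * 2)  # 2 s during music (weaker alpha)

def bandpower(x, fs, lo, hi):
    # Mean power spectral density within [lo, hi] Hz (Welch estimate).
    f, pxx = welch(x, fs=fs, nperseg=fs)
    band = (f >= lo) & (f <= hi)
    return pxx[band].mean()

p_ref = bandpower(baseline, fs, 8, 13)   # alpha power, reference period
p_act = bandpower(listening, fs, 8, 13)  # alpha power while listening
erd = 100 * (p_act - p_ref) / p_ref      # negative values = desynchronisation
print(f"ERD: {erd:.1f}%")
```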


Brain-Computer Interfaces | 2014

Investigating music tempo as a feedback mechanism for closed-loop BCI control

Ian Daly; Duncan Williams; Faustina Hwang; Alexis Kirke; Asad Malik; Etienne B. Roesch; James Weaver; Eduardo Reck Miranda; Slawomir J. Nasuto

The feedback mechanism used in a brain-computer interface (BCI) forms an integral part of the closed-loop learning process required for successful operation of a BCI. However, ultimate success of the BCI may be dependent upon the modality of the feedback used. This study explores the use of music tempo as a feedback mechanism in BCI and compares it to the more commonly used visual feedback mechanism. Three different feedback modalities are compared for a kinaesthetic motor imagery BCI: visual, auditory via music tempo, and a combined visual and auditory feedback modality. Visual feedback is provided via the position, on the y-axis, of a moving ball. In the music feedback condition, the tempo of a piece of continuously generated music is dynamically adjusted via a novel music-generation method. All the feedback mechanisms allowed users to learn to control the BCI. However, users were not able to maintain as stable control with the music tempo feedback condition as they could in the visual feedback and combined conditions. Additionally, the combined condition exhibited significantly less inter-user variability, suggesting that multi-modal feedback may lead to more robust results. Finally, common spatial patterns are used to identify participant-specific spatial filters for each of the feedback modalities. The mean optimal spatial filter obtained for the music feedback condition is observed to be more diffuse and weaker than the mean spatial filters obtained for the visual and combined feedback conditions.
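
Common spatial patterns, used above to derive the participant-specific spatial filters, can be computed from a generalized eigendecomposition of the two class covariance matrices. A minimal numpy/scipy sketch on synthetic trials; the data shapes are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b):
    """trials_*: arrays of shape (n_trials, n_channels, n_samples)."""
    cov_a = np.mean([np.cov(t) for t in trials_a], axis=0)
    cov_b = np.mean([np.cov(t) for t in trials_b], axis=0)
    # Generalized eigenproblem: maximise class-A variance relative to
    # the pooled covariance; columns of W are spatial filters.
    vals, W = eigh(cov_a, cov_a + cov_b)
    return W[:, np.argsort(vals)[::-1]]  # strongest class-A filters first

rng = np.random.default_rng(0)
a = rng.standard_normal((40, 8, 512))  # 40 trials, 8 channels, 512 samples
b = rng.standard_normal((40, 8, 512))
W = csp_filters(a, b)
filtered = W[:, :2].T @ a[0]           # project one trial onto top 2 filters
print(filtered.shape)                  # (2, 512)
```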


ACM Transactions on Applied Perception | 2015

Investigating Perceived Emotional Correlates of Rhythmic Density in Algorithmic Music Composition

Duncan Williams; Alexis Kirke; Eduardo Reck Miranda; Ian Daly; James Hallowell; James Weaver; Asad Malik; Etienne B. Roesch; Faustina Hwang; Slawomir J. Nasuto

Affective algorithmic composition is a growing field that combines perceptually motivated affective computing strategies with novel music generation. This article presents work toward the development of one such application. The long-term goal is to develop a responsive and adaptive system for inducing affect that is both controlled and validated by biophysical measures. Literature documenting perceptual responses to music identifies a variety of musical features and possible affective correlations, but perceptual evaluations of these musical features for the purposes of inclusion in a music generation system are not readily available. A discrete feature, rhythmic density (a function of note duration in each musical bar, regardless of tempo), was selected because it was shown to be well correlated with affective responses in the existing literature. A prototype system was then designed to produce controlled degrees of variation in rhythmic density via a transformative algorithm. A two-stage perceptual evaluation of a stimulus set created by this prototype was then undertaken. First, listener responses from a pairwise scaling experiment were analyzed via Multidimensional Scaling Analysis (MDS). The statistical best-fit solution was rotated such that stimuli with the largest range of variation were placed across the horizontal plane in two dimensions. In this orientation, stimuli with deliberate variation in rhythmic density appeared farther from the source material used to generate them than from stimuli generated by random permutation. Second, the same stimulus set was evaluated, in the order suggested by the rotated two-dimensional solution, in a verbal elicitation experiment. A Verbal Protocol Analysis (VPA) found that listener perception of the stimulus set varied in at least two commonly understood emotional descriptors, which might be considered affective correlates of rhythmic density. These results thus corroborate previous studies in which musical parameters are monitored for changes in emotional expression, show that similarly parameterized control of perceived emotional content can be achieved in an affective algorithmic composition system, and provide a methodology for evaluating further candidate musical features for inclusion in such a system. Some suggestions regarding the test procedure and analysis techniques are also documented here.
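
The MDS step described above can be illustrated in a few lines: embed pairwise listener dissimilarities in two dimensions, matching the two-dimensional solution discussed. The dissimilarity matrix here is random, standing in for the paper's pairwise scaling data.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_stimuli = 10
d = rng.random((n_stimuli, n_stimuli))
d = (d + d.T) / 2          # symmetrise the dissimilarities
np.fill_diagonal(d, 0)     # zero self-dissimilarity

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(d)   # one 2-D point per stimulus
print(coords.shape)             # (10, 2)
```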


Computer Science and Electronic Engineering Conference | 2015

Towards human-computer music interaction: Evaluation of an affectively-driven music generator via galvanic skin response measures

Ian Daly; Asad Malik; James Weaver; Faustina Hwang; Slawomir J. Nasuto; Duncan Williams; Alexis Kirke; Eduardo Reck Miranda

An affectively driven music generation system is described and evaluated. The system is developed for eventual use in human-computer interaction systems such as brain-computer music interfaces, and is evaluated for its ability to induce changes in a listener's affective state. The affectively driven algorithmic composition system was used to generate a stimulus set covering 9 discrete sectors of a 2-dimensional affective space by means of a 16-channel feedforward artificial neural network. This system was used to generate 90 short pieces of music with specific affective intentions, 10 stimuli for each of the 9 sectors in the affective space. These pieces were played to 20 healthy participants, and it was observed that the music generation system induced the intended affective states in the participants. This is further verified by inspecting the galvanic skin response recorded from the participants.
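
The "9 discrete sectors of a 2-dimensional affective space" suggest a 3x3 partition of the valence-arousal plane. The sketch below shows one such binning, purely as an assumed illustration; the paper does not specify its sector boundaries.

```python
def sector(valence: float, arousal: float) -> int:
    """Return a sector index 0-8 for a (valence, arousal) pair in [-1, 1]^2,
    binning each axis into low/medium/high thirds (an assumed scheme)."""
    v = min(int((valence + 1) / 2 * 3), 2)   # 0, 1, or 2
    a = min(int((arousal + 1) / 2 * 3), 2)
    return 3 * a + v

print(sector(-0.8, 0.9))   # high arousal, low valence -> sector 6
```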


Affective Computing and Intelligent Interaction | 2015

Identifying music-induced emotions from EEG for use in brain-computer music interfacing

Ian Daly; Asad Malik; James Weaver; Faustina Hwang; Slawomir J. Nasuto; Duncan Williams; Alexis Kirke; Eduardo Reck Miranda

Brain-computer music interfaces (BCMI) provide a method to modulate an individual's affective state via the selection or generation of music according to their current affective state. Potential applications of such systems include entertainment or therapeutic applications. We outline a proposed design for such a BCMI and seek a method for automatically differentiating music-induced affective states. Band-power features are explored for use in automatically identifying music-induced affective states. Additionally, a linear discriminant analysis classifier and a support vector machine are evaluated with respect to their ability to classify music-induced affective states from the electroencephalogram recorded during a BCMI calibration task. Accuracies of up to 79.5% (p < 0.001) are achieved with the support vector machine.
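
A minimal sketch of the classification comparison described above: band-power features fed to LDA and an SVM, scored by cross-validation. The features here are synthetic noise, so accuracies land near chance rather than the paper's 79.5%; feature counts are assumptions.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_features = 120, 16                    # e.g. 4 bands x 4 channels
X = rng.standard_normal((n_trials, n_features))   # band-power features
y = rng.integers(0, 2, n_trials)                  # two affective states

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf"))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")   # random features -> chance-level accuracy
```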


Journal of Neuroscience Methods | 2015

Automated identification of neural correlates of continuous variables

Ian Daly; Faustina Hwang; Alexis Kirke; Asad Malik; James Weaver; Duncan Williams; Eduardo Reck Miranda; Slawomir J. Nasuto

BACKGROUND The electroencephalogram (EEG) may be described by a large number of different feature types and automated feature selection methods are needed in order to reliably identify features which correlate with continuous independent variables. NEW METHOD A method is presented for the automated identification of features that differentiate two or more groups in neurological datasets based upon a spectral decomposition of the feature set. Furthermore, the method is able to identify features that relate to continuous independent variables. RESULTS The proposed method is first evaluated on synthetic EEG datasets and observed to reliably identify the correct features. The method is then applied to EEG recorded during a music listening task and is observed to automatically identify neural correlates of music tempo changes similar to neural correlates identified in a previous study. Finally, the method is applied to identify neural correlates of music-induced affective states. The identified neural correlates reside primarily over the frontal cortex and are consistent with widely reported neural correlates of emotions. COMPARISON WITH EXISTING METHODS The proposed method is compared to the state-of-the-art methods of canonical correlation analysis and common spatial patterns, in order to identify features differentiating synthetic event-related potentials of different amplitudes and is observed to exhibit greater performance as the number of unique groups in the dataset increases. CONCLUSIONS The proposed method is able to identify neural correlates of continuous variables in EEG datasets and is shown to outperform canonical correlation analysis and common spatial patterns.
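
The paper's method rests on a spectral decomposition of the feature set, which the abstract does not detail. As a much simpler stand-in, the sketch below screens each candidate EEG feature by its correlation with a continuous variable (tempo), with Bonferroni correction; this is an illustrative baseline on synthetic data, not the authors' algorithm.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_trials, n_features = 100, 50
features = rng.standard_normal((n_trials, n_features))
tempo = rng.uniform(50, 200, n_trials)        # continuous independent variable
features[:, 3] += 0.02 * tempo                # plant one true correlate

# Keep features whose correlation with tempo survives Bonferroni correction.
selected = [
    j for j in range(n_features)
    if pearsonr(features[:, j], tempo)[1] < 0.05 / n_features
]
print("candidate correlates:", selected)      # should include feature 3
```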


Frontiers in Human Neuroscience | 2017

Directed motor-auditory EEG connectivity is modulated by music tempo

Nicoletta Nicolaou; Asad Malik; Ian Daly; James Weaver; Faustina Hwang; Alexis Kirke; Etienne B. Roesch; Duncan Williams; Eduardo Reck Miranda; Slawomir J. Nasuto

Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute). The music stimuli consisted of piano excerpts designed to convey the emotion of “peacefulness”. Noise stimuli with an identical acoustic content to the music excerpts were also presented for comparison purposes. The brain activity interactions were characterized with the imaginary part of coherence (iCOH) in the frequency range 1.5–18 Hz (δ, θ, α and lower β) between all pairs of EEG electrodes for the four tempi and the music/noise conditions, as well as a baseline resting state (RS) condition obtained at the start of the experimental task. Our findings can be summarized as follows: (a) there was an ongoing long-range interaction in the RS engaging fronto-posterior areas; (b) this interaction was maintained in both music and noise, but its strength and directionality were modulated as a result of acoustic stimulation; (c) the topological patterns of iCOH were similar for music, noise and RS, however statistically significant differences in strength and direction of iCOH were identified; and (d) tempo had an effect on the direction and strength of motor-auditory interactions. Our findings are in line with existing literature and illustrate a part of the mechanism by which musical stimuli with different tempi can entrain changes in cortical activity.
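
The imaginary part of coherency (iCOH) used here suppresses zero-lag coupling such as volume conduction. A minimal sketch for one electrode pair over the paper's 1.5-18 Hz range; the signals, lag, and sampling rate are otherwise illustrative assumptions.

```python
import numpy as np
from scipy.signal import csd, welch

fs = 256
rng = np.random.default_rng(0)
x = rng.standard_normal(fs * 60)
y = np.roll(x, 10) + rng.standard_normal(fs * 60)  # lagged copy plus noise

# Welch cross-spectrum and auto-spectra for the pair.
f, sxy = csd(x, y, fs=fs, nperseg=fs * 2)
_, sxx = welch(x, fs=fs, nperseg=fs * 2)
_, syy = welch(y, fs=fs, nperseg=fs * 2)

icoh = np.imag(sxy / np.sqrt(sxx * syy))  # nonzero only for lagged coupling
band = (f >= 1.5) & (f <= 18)             # the paper's 1.5-18 Hz range
print(f"mean |iCOH| in 1.5-18 Hz: {np.abs(icoh[band]).mean():.3f}")
```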

Collaboration


Dive into Asad Malik's collaborations.

Top Co-Authors

Alexis Kirke

Plymouth University


Ian Daly

University of Reading
