Nisrine Jrad
University of Technology of Troyes
Publications
Featured research published by Nisrine Jrad.
Journal of Neural Engineering | 2011
Nisrine Jrad; Marco Congedo; Ronald Phlypo; Sandra Rousseau; Rémi Flamary; Florian Yger; Alain Rakotomamonjy
In many machine learning applications, such as brain-computer interfaces (BCI), high-dimensional sensor array data are available. Sensor measurements are often highly correlated, and the signal-to-noise ratio is not spread homogeneously across sensors. Collected data are therefore highly variable, and discrimination tasks are challenging. In this work, we focus on sensor weighting as an efficient tool to improve the classification procedure. We present an approach that integrates sensor weighting into the classification framework: sensor weights are treated as hyper-parameters to be learned by a support vector machine (SVM). The resulting sensor-weighting SVM (sw-SVM) is designed to satisfy a margin criterion, that is, to control the generalization error. Experimental studies on two data sets are presented: a P300 data set and an error-related potential (ErrP) data set. For the P300 data set (BCI competition III), for which a large number of trials is available, the sw-SVM performs on par with the ensemble SVM strategy that won the competition. For the ErrP data set, for which only a small number of trials is available, the sw-SVM outperforms three state-of-the-art approaches. These results suggest that the sw-SVM is a promising tool for event-related potential classification, even with a small number of training trials.
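The sensor-weighting idea can be illustrated with a minimal sketch: weight each channel before feeding flattened trials to a linear classifier. Everything below is an illustrative assumption, not the paper's method — the synthetic ERP data, the discriminability-based weights, and a ridge-regularized least-squares classifier standing in for the SVM training step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ERP-like data: 200 trials, 8 sensors, 50 time samples.
# Only sensor 0 carries class-discriminative signal; the rest are noise.
n_trials, n_sensors, n_times = 200, 8, 50
y = rng.integers(0, 2, n_trials) * 2 - 1          # labels in {-1, +1}
X = rng.normal(0.0, 1.0, (n_trials, n_sensors, n_times))
X[:, 0, :] += 0.8 * y[:, None]                    # informative sensor

def fit_linear(X, y, w_sensors):
    """Weight each sensor, flatten trials, fit a ridge linear classifier
    (a stand-in for the SVM used in the paper)."""
    Xw = (X * w_sensors[None, :, None]).reshape(len(X), -1)
    A = Xw.T @ Xw + 1.0 * np.eye(Xw.shape[1])     # ridge-regularized normal equations
    return np.linalg.solve(A, Xw.T @ y)

def accuracy(X, y, w_sensors, beta):
    Xw = (X * w_sensors[None, :, None]).reshape(len(X), -1)
    return np.mean(np.sign(Xw @ beta) == y)

# Crude data-driven sensor weights: discriminability of each sensor's
# mean amplitude between the two classes (not the sw-SVM margin criterion).
mu_diff = X[y == 1].mean(axis=(0, 2)) - X[y == -1].mean(axis=(0, 2))
w = np.abs(mu_diff) / np.abs(mu_diff).sum()

beta = fit_linear(X, y, w)
print("largest weight on sensor", int(np.argmax(w)))
print("training accuracy: %.2f" % accuracy(X, y, w, beta))
```

As expected, nearly all weight lands on the one informative sensor, which is the behavior the sw-SVM formalizes through its margin-based hyper-parameter learning.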
Neurocomputing | 2012
Nisrine Jrad; Marco Congedo
Classifying brain activity is challenging, since electroencephalography (EEG) recordings exhibit distinct, individualized spatial and temporal characteristics correlated with noise and with various physical and mental activities. To increase classification accuracy, it is thus crucial to identify discriminant spatio-temporal features. This paper presents a method for analyzing the spatio-temporal characteristics associated with event-related potentials (ERPs). First, a resampling procedure based on Global Field Power (GFP) extracts temporal features. Second, a spatially weighted SVM (sw-SVM) learns a spatial filter optimizing the classification performance for each temporal feature. Third, the resulting ensemble of sw-SVM classifiers is combined through a weighted combination of all sw-SVM outputs. Results indicate that including temporal features provides useful insight for both classification performance and physiological understanding.
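Global Field Power itself is simple to compute: it is the spatial standard deviation across electrodes at each time sample, and its maxima mark moments of strong, stable topography — natural candidates for the temporal features mentioned above. The synthetic evoked response below is an assumption for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic evoked response: 32 channels, 300 time samples.
# A single component peaks at sample 120, with channel-dependent gains.
n_channels, n_times = 32, 300
gains = rng.normal(0.0, 1.0, n_channels)
t = np.arange(n_times)
component = np.exp(-0.5 * ((t - 120) / 15.0) ** 2)   # Gaussian bump at t = 120
eeg = gains[:, None] * component[None, :] + rng.normal(0, 0.05, (n_channels, n_times))

# Global Field Power: standard deviation across channels at each sample.
gfp = eeg.std(axis=0)

# GFP maxima give the candidate latencies for the resampling procedure.
peak = int(np.argmax(gfp))
print("GFP peak latency (sample):", peak)
```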
Computational and Mathematical Methods in Medicine | 2014
Rémi Flamary; Nisrine Jrad; Ronald Phlypo; Marco Congedo; Alain Rakotomamonjy
This work investigates the use of mixed-norm regularization for sensor selection in event-related potential (ERP) based brain-computer interfaces (BCI). The classification problem is cast as a discriminative optimization framework in which sensor selection is induced through mixed norms. The framework is then extended to the multitask learning setting, where several similar classification tasks related to different subjects are learned simultaneously. In this case, multitask learning helps mitigate data scarcity, yielding more robust classifiers. For this purpose, we introduce a regularizer that induces both sensor selection and similarity between classifiers. The different regularization approaches are compared on three ERP data sets, demonstrating the benefit of mixed-norm regularization for sensor selection. The multitask approaches are evaluated when only a small number of training examples is available, yielding significant performance improvements, especially for poorly performing subjects.
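The sensor-selection effect of a mixed norm can be sketched with the proximal operator of the l2,1 norm, which shrinks sensor rows of a weight matrix and zeroes the weak ones entirely. This is a generic building block of mixed-norm solvers, shown here on made-up numbers, not the paper's exact algorithm.

```python
import numpy as np

def prox_l21(W, tau):
    """Proximal operator of the l2,1 mixed norm.

    W: (n_sensors, n_features) weight matrix, one row per sensor.
    Each row is shrunk by tau in l2 norm; rows whose norm falls below
    tau are zeroed out entirely, which is how the mixed norm induces
    sensor selection.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return W * scale

rng = np.random.default_rng(2)
W = rng.normal(0.0, 1.0, (8, 10))
W[3:] *= 0.05                       # sensors 3..7 carry only tiny weights

W_sparse = prox_l21(W, tau=0.5)
kept = np.flatnonzero(np.linalg.norm(W_sparse, axis=1) > 0)
print("sensors kept:", kept.tolist())
```

Inside an iterative solver this operator is applied after each gradient step, so the weak sensors are progressively discarded rather than merely down-weighted.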
International Conference of the IEEE Engineering in Medicine and Biology Society | 2015
Nisrine Jrad; Amar Kachenoura; Isabelle Merlet; Anca Nica; Christian Bénar; Fabrice Wendling
High-frequency oscillations (HFOs, 40-500 Hz), recorded with intracerebral electroencephalography (iEEG) in epileptic patients, are categorized into four distinct sub-bands (gamma, high-gamma, ripples and fast ripples). They have recently been used as a reliable biomarker of epileptogenic zones. The objective of this paper is to investigate the possibility of discriminating between the different classes of HFOs, whose physiological/pathological value is critical for diagnosis but remains to be clarified. The proposed method is based on a feature vector built from energy ratios (computed using the wavelet transform, WT) in a-priori-defined frequency bands. It uses multiclass linear discriminant analysis (LDA) and is applied to iEEG signals recorded in patients who are candidates for epilepsy surgery. Results obtained by bootstrapping over training/test data sets indicate high performance in terms of sensitivity and specificity.
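The band-energy-ratio feature can be sketched as follows. The paper computes energies with a wavelet transform; a plain FFT periodogram is used here as a simplified stand-in with the same normalization idea, and the sampling rate, band edges and synthetic "ripple" event are all illustrative assumptions.

```python
import numpy as np

fs = 2048.0                     # assumed iEEG sampling rate (Hz)
bands = {"gamma": (40, 80), "high_gamma": (80, 120),
         "ripple": (120, 250), "fast_ripple": (250, 500)}

def band_energy_ratios(x, fs, bands):
    """Energy ratio per frequency band, from an FFT periodogram.

    Each band's energy is normalized by the total energy in 40-500 Hz,
    giving a feature vector that sums to (approximately) one.
    """
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    total = power[(freqs >= 40) & (freqs <= 500)].sum()
    return {name: power[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

# Synthetic 100 ms "ripple" event: a 180 Hz oscillation in noise.
rng = np.random.default_rng(3)
t = np.arange(int(0.1 * fs)) / fs
x = np.sin(2 * np.pi * 180.0 * t) + 0.2 * rng.normal(size=t.size)

ratios = band_energy_ratios(x, fs, bands)
label = max(ratios, key=ratios.get)
print("dominant band:", label)
```

A multiclass LDA would then be trained on such ratio vectors, one per detected event.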
BioMed Research International | 2009
Nisrine Jrad; Edith Grall-Maës; Pierre Beauseroy
Supervised learning of microarray data has received much attention in recent years. Multiclass cancer diagnosis based on selected gene profiles is used as an adjunct to clinical diagnosis. However, an erroneous automated diagnosis may hinder patient care, add expense, or confound a result. To avoid such misleading decisions, a multiclass cancer diagnosis with class-selective rejection is proposed. It rejects some patients from one, several, or all classes in order to ensure higher reliability while reducing time and expense. Moreover, this classifier takes into account asymmetric penalties that depend on each class and on each wrong or partially correct decision. It is based on the ν-1-SVM coupled with its regularization path, and it minimizes a general loss function defined in the class-selective rejection scheme. State-of-the-art multiclass algorithms can be considered a particular case of the proposed algorithm in which the decision options are the classes themselves and the loss function is the Bayesian risk. Two experiments are carried out, in the Bayesian and the class-selective rejection frameworks. Five data sets with selected genes are used to assess the performance of the proposed method. Results are discussed, and accuracies are compared with those of Naive Bayes, nearest-neighbor, linear perceptron, multilayer perceptron, and support vector machine classifiers.
International Conference on Pattern Recognition | 2008
Nisrine Jrad; Edith Grall-Maës; Pierre Beauseroy
A procedure for selecting a supervised rule for a multiclass problem from a labeled data set is proposed. The rule allows class-selective rejection and performance constraints. The unknown probabilities are estimated with a Parzen estimator. A set of rules is built by varying the smoothness parameter of the Parzen estimates of the marginal probabilities and plugging them into the statistical hypothesis rules. A criterion that assesses the quality of these rules is estimated and used to select a rule. Resampling and aggregation methods are used to show the efficiency of the estimated criterion.
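A Parzen (kernel density) estimator is the density-estimation building block referred to above. The minimal Gaussian-kernel version below, on synthetic one-dimensional data, shows the smoothness parameter h that the selection procedure varies; the specific data and h value are assumptions for illustration.

```python
import numpy as np

def parzen_pdf(x_eval, samples, h):
    """Parzen window density estimate with a Gaussian kernel of width h."""
    diffs = (x_eval[:, None] - samples[None, :]) / h
    kernel = np.exp(-0.5 * diffs ** 2) / np.sqrt(2 * np.pi)
    return kernel.mean(axis=1) / h

rng = np.random.default_rng(4)
samples = rng.normal(0.0, 1.0, 2000)          # draws from N(0, 1)
grid = np.array([-1.0, 0.0, 1.0])

# h is the smoothness parameter varied in the rule-selection procedure;
# a single illustrative value is shown here.
est = parzen_pdf(grid, samples, h=0.3)
print(est)   # close to the N(0, 1) density at those points
```

Small h yields a spiky estimate, large h an over-smoothed one; the quality criterion in the paper is what arbitrates between the resulting plug-in rules.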
2015 International Conference on Advances in Biomedical Engineering (ICABME) | 2015
Nisrine Jrad; Amar Kachenoura; Isabelle Merlet; Fabrice Wendling
Interictal high-frequency oscillations (HFOs, 30-600 Hz) have recently been used as reliable biomarkers of epileptogenic zones. Intra- and inter-subject variations are the main source of HFO diversity in spectral and temporal characteristics. Based on their spectral characteristics, HFOs are usually classified into four sub-bands: gamma, high-gamma, ripples and fast ripples. The objective of this paper is to investigate the possibility of discriminating between HFO classes. The Gabor transform is used to extract relevant features from intracerebral electroencephalography signals recorded in patients who are candidates for epilepsy surgery. A multiclass linear discriminant analysis is applied to discriminate the HFO categories. Results obtained by bootstrapping over training/test data sets show high performance in terms of sensitivity and specificity. Results are also compared with those obtained using Daubechies wavelets.
IEEE Transactions on Biomedical Engineering | 2018
Marcelo A. Colominas; Mohamad El Sayed Hussein Jomaa; Nisrine Jrad; Anne Humeau-Heurtier; Patrick Van Bogaert
Objective: Our goal is to use existing and to propose new time–frequency entropy measures that objectively evaluate the improvement on epileptic patients after medication by studying their resting state electroencephalography (EEG) recordings. An increase in the complexity of the signals would confirm an improvement in the general state of the patient. Methods: We review the Rényi entropy based on time–frequency representations, along with its time-varying version. We also discuss the entropy based on singular value decomposition computed from a time–frequency representation, and introduce its corresponding time-dependent version. We test these quantities on synthetic data. Friedman tests are used to confirm the differences between signals (before and after proper medication). Principal component analysis is used for dimensionality reduction prior to a simple threshold discrimination. Results: Experimental results show a consistent increase in complexity measures in the different regions of the brain. These findings suggest that the extracted features can be used to monitor treatment. When combined, they are useful for classification purposes, with areas under ROC curves higher than 0.93 in some regions. Conclusion: Here we applied time–frequency complexity measures to resting state EEG signals from epileptic patients for the first time. We also introduced a new time-varying complexity measure. We showed that these features are able to evaluate the treatment of the patient, and to perform classification. Significance: The time–frequency complexities, and their time-varying versions, can be used to monitor the treatment of epileptic patients. They could be applied to a wider range of problems.
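The core quantity here, the Rényi entropy of a time-frequency representation, is easy to state: normalize the representation to a distribution p and compute H_α = log2(Σ p^α)/(1−α). The sketch below uses a minimal hand-rolled spectrogram and synthetic signals (all assumptions, not the paper's data); a pure tone concentrates its energy and thus has lower entropy than broadband noise.

```python
import numpy as np

def renyi_entropy_tfr(tfr, alpha=3.0):
    """Order-alpha Rényi entropy of a (non-negative) time-frequency
    representation, normalized to sum to one."""
    p = tfr / tfr.sum()
    return np.log2((p ** alpha).sum()) / (1.0 - alpha)

def spectrogram(x, win=64, hop=16):
    """Magnitude-squared STFT with a Hann window (minimal implementation)."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

rng = np.random.default_rng(5)
n = 1024
tone = np.sin(2 * np.pi * 0.1 * np.arange(n))      # one concentrated component
noise = rng.normal(0.0, 1.0, n)                    # energy spread everywhere

h_tone = renyi_entropy_tfr(spectrogram(tone))
h_noise = renyi_entropy_tfr(spectrogram(noise))
print("tone entropy < noise entropy:", h_tone < h_noise)
```

The time-varying versions in the paper apply the same idea within sliding windows, so that entropy becomes a function of time.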
European Signal Processing Conference | 2017
Nisrine Jrad; Amar Kachenoura; Anca Nica; Isabelle Merlet; Fabrice Wendling
Interictal high-frequency oscillations (HFOs, 30-600 Hz), recorded with intracerebral electroencephalography (iEEG) in the epileptic brain, have been shown to be potential biomarkers of epilepsy. Hence, their automatic detection has become a subject of high interest. So far, detection algorithms have consisted of comparing HFO energy, computed in bands of interest, to a threshold. In this paper, a sequential technique is investigated. Detection is based on a variant of the cumulative sum (CUSUM) test, the Page-Hinkley algorithm, which is optimal for detecting abrupt changes in the mean of a normally distributed random signal. Experiments on simulated and real data sets show the good performance of the method in terms of sensitivity and false detection rate. Compared with classical thresholding, Page-Hinkley performs better.
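The Page-Hinkley test itself is a few lines: accumulate deviations of the signal from its running mean (minus a tolerated drift), track the running minimum of that sum, and flag a change when the current sum rises a threshold above the minimum. The parameter values and the synthetic mean jump below are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def page_hinkley(x, delta=0.5, threshold=10.0):
    """Page-Hinkley test for an upward jump in the mean of a signal.

    Returns the index at which the cumulative-deviation statistic first
    exceeds `threshold`, or -1 if no change is detected. `delta` is the
    magnitude of drift tolerated before deviations accumulate.
    """
    mean, cum, cum_min = 0.0, 0.0, 0.0
    for i, v in enumerate(x, 1):
        mean += (v - mean) / i                # running mean of the signal
        cum += v - mean - delta               # cumulative deviation
        cum_min = min(cum_min, cum)
        if cum - cum_min > threshold:         # PH statistic crosses threshold
            return i - 1
    return -1

rng = np.random.default_rng(6)
# Mean jumps from 0 to 2 at sample 500 (a crude stand-in for the energy
# increase an HFO produces in a band of interest).
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(2, 1, 200)])
print("change detected at sample:", page_hinkley(x))
```

Unlike a fixed energy threshold, the statistic adapts to the running baseline, which is the advantage the abstract reports over classical thresholding.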
Archive | 2010
Nisrine Jrad; Pierre Beauseroy; Edith Grall-Maës
The task of classification occurs in a wide range of human activity. The problem concerns learning a decision rule that assigns a pattern to a decision option on the basis of observed attributes or features. Contexts in which a classification task is fundamental include sorting letters on the basis of machine-read postcodes, the preliminary diagnosis of a patient's disease, and the detection of fraudulent currency and documents. In the classical framework, decision options are given by the pre-defined classes, and a decision rule is designed by optimizing a given loss function, for instance the misclassification rate. In some cases, the loss function should be more general. First, for some applications, like face identification or cancer diagnosis, one may favor withholding a decision over taking a wrong one. In such cases, the introduction of rejection options should be considered in order to ensure higher reliability Ha (1997); Horiuchi (1998); Jrad, Grall-Maës & Beauseroy (2008); Jrad et al. (2009d). Basic rejection consists of assigning a pattern to all classes, which means that no decision is taken. More advanced rejection methods assign a pattern ambiguously to a subset of classes. In this class-selective rejection scheme, decision options are given by the pre-defined classes as well as by subsets formed from combinations of these classes. In order to define a decision rule, a general loss function can be defined by costs that penalize the wrong decisions and the ambiguous ones differently. Some applications may also require controlling the performance of the decision rule or, more specifically, performance indicators related to the decision rule, which can be formulated as performance constraints. Hence, the decision problem should also take these constraints into account. A general formulation of this problem was proposed in Grall-Maës & Beauseroy (2009).
The decision problem is formulated as a constrained optimization problem. It was shown that the optimal rule can be obtained by optimizing its Lagrangian dual function, that is, by finding the saddle point of the Lagrangian. This optimal theoretical rule is applicable when the probability distributions are known. In many applications, however, only a limited training set is available; one must therefore infer a classifier from a more or less limited set of training examples. In the classical decision framework, many historical strands of research can be identified: statistical methods, support vector machines, neural networks Bishop (2006); Guobin & Lu (2007); Hao & Lin (2007); Husband & Lin (2002); Vapnik (1998); Yang et al. (2007). In the class-selective rejection scheme, fewer works exist Ha (1997); Horiuchi (1998).
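When the class posteriors are known, the class-selective rejection rule reduces to picking, among all non-empty subsets of classes, the one with minimum expected cost. The sketch below uses an invented cost structure (zero cost for a correct single-class decision, a size-dependent cost for an ambiguous subset containing the true class, unit cost otherwise); the chapter's loss function is more general, and no performance constraints are enforced here.

```python
import numpy as np
from itertools import combinations

def ambiguity_cost(k):
    """Illustrative cost of deciding a subset of k classes: free for a
    single class, growing with the size of the ambiguous subset."""
    return 0.15 * (k - 1)

def class_selective_decision(posteriors):
    """Pick the decision option (a non-empty subset of classes) with
    minimum expected cost: the subset's ambiguity cost if the true class
    lies inside it, a unit cost if it lies outside."""
    n = len(posteriors)
    best, best_cost = None, np.inf
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            p_in = float(sum(posteriors[j] for j in subset))
            cost = ambiguity_cost(k) * p_in + 1.0 * (1.0 - p_in)
            if cost < best_cost:
                best, best_cost = subset, cost
    return best

print(class_selective_decision(np.array([0.90, 0.05, 0.05])))  # -> (0,)
print(class_selective_decision(np.array([0.45, 0.45, 0.10])))  # -> (0, 1)
```

A confident posterior yields a single-class decision, while two competing classes trigger an ambiguous assignment to both — the behavior that distinguishes class-selective rejection from plain rejection.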