
Publication


Featured research published by Pascale Kuntz.


Expert Systems with Applications | 2015

Learning from multi-label data with interactivity constraints

Noureddine-Yassine Nair-Benrekia; Pascale Kuntz; Frank Meyer

Highlights:
- Extensive study of 12 multi-label learning methods under interactivity constraints.
- Focus on the beginning of the classification task, where few examples are available.
- Experimental evaluation with a protocol independent of any implementation environment.
- Classifier performances evaluated on 7 quality and time criteria over 12 datasets.
- RF-PCT obtains the best predictive performance while being computationally efficient.

Interactive classification aims at introducing user preferences into the learning process to produce individualized outcomes better adapted to each user's behavior than fully automatic approaches. Current interactive classification systems generally adopt a single-label classification paradigm that constrains items to carry one label at a time and consequently limits the user's expressiveness when interacting with data that are inherently multi-label. Moreover, the experimental evaluations are mainly subjective and depend closely on the targeted use cases and the interface characteristics. This paper presents the first extensive study of the impact of interactivity constraints on the performances of a large set of twelve well-established multi-label learning methods. We restrict ourselves to evaluating the classifiers' predictive and computation-time performances as the number of training examples regularly increases, and we focus on the beginning of the classification task, where few examples are available. The classifier performances are evaluated with an experimental protocol independent of any implementation environment on a set of twelve multi-label benchmarks of various sizes from different domains.
Our comparison shows that four classifiers can be distinguished for prediction quality: RF-PCT (Random Forest of Predictive Clustering Trees, Kocev, 2011), EBR (Ensemble of Binary Relevance, Read et al., 2011), CLR (Calibrated Label Ranking, Fürnkranz et al., 2008), and ML-kNN (Multi-label kNN, Zhang and Zhou, 2007), with an advantage for the first two ensemble classifiers. Moreover, only RF-PCT competes with the fastest classifiers and is therefore considered the most promising classifier for an interactive multi-label learning system.
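One of the strong performers above, EBR, builds on the binary relevance idea: train one independent binary classifier per label. A minimal sketch in numpy, substituting a deliberately simple nearest-centroid model for the base learners compared in the study:

```python
import numpy as np

def train_binary_relevance(X, Y):
    """Binary relevance: one independent binary model per label.
    Each per-label model is a pair of class centroids (a simple stand-in
    for the base learners compared in the study)."""
    models = []
    for j in range(Y.shape[1]):
        pos = X[Y[:, j] == 1]
        neg = X[Y[:, j] == 0]
        # fall back to the global mean when a label has no examples yet,
        # as can happen at the start of an interactive session
        c_pos = pos.mean(axis=0) if len(pos) else X.mean(axis=0)
        c_neg = neg.mean(axis=0) if len(neg) else X.mean(axis=0)
        models.append((c_pos, c_neg))
    return models

def predict_binary_relevance(models, X):
    """Predict each label independently from its own model."""
    preds = np.zeros((X.shape[0], len(models)), dtype=int)
    for j, (c_pos, c_neg) in enumerate(models):
        d_pos = np.linalg.norm(X - c_pos, axis=1)
        d_neg = np.linalg.norm(X - c_neg, axis=1)
        preds[:, j] = (d_pos < d_neg).astype(int)
    return preds
```

Replacing the centroid model with a stronger base learner and bagging several of them would recover the ensemble scheme evaluated in the paper.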


Pacific-Asia Conference on Knowledge Discovery and Data Mining | 2017

Combining Dimensionality Reduction with Random Forests for Multi-label Classification Under Interactivity Constraints

Noureddine-Yassine Nair-Benrekia; Pascale Kuntz; Frank Meyer

Learning from multi-label data in an interactive framework is a challenging problem as algorithms must withstand some additional constraints: in particular, learning from few training examples in a limited time. A recent study of multi-label classifier behaviors in this context has identified the potential of the ensemble method “Random Forest of Predictive Clustering Trees” (RF-PCT). However, RF-PCT has shown a degraded performance in terms of computation time for large feature spaces. To overcome this limit, this paper proposes a new hybrid multi-label learning approach IDSR-RF (Independent Dual Space Reduction with RF-PCT) which first reduces the data dimension and then learns a predictive regression model in the reduced spaces with RF-PCT. The feature and the label spaces are independently reduced using the fast matrix factorization algorithm Gravity. The experimental results on nine high-dimensional datasets show that IDSR-RF significantly reduces the computation time without deteriorating the learning performances. To the best of our knowledge, it is currently the most promising learning approach for an interactive multi-label learning system.
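The IDSR-RF pipeline described above can be sketched in a few lines. This is an illustrative approximation, not the authors' implementation: truncated SVD stands in for the Gravity factorization, and a linear least-squares regressor stands in for RF-PCT.

```python
import numpy as np

def reduce_fit(M, k):
    """Rank-k basis via truncated SVD (a stand-in for the Gravity
    matrix-factorization algorithm used in IDSR-RF)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:k].T  # columns span the reduced space

def idsr_fit(X, Y, kx, ky):
    """Reduce the feature and label spaces independently, then learn a
    regressor between the two reduced spaces."""
    Px = reduce_fit(X, kx)           # feature-space projection
    Py = reduce_fit(Y, ky)           # label-space projection
    Xr, Yr = X @ Px, Y @ Py          # both spaces reduced independently
    # least-squares regressor in the reduced spaces
    # (RF-PCT in the paper; linear here to stay self-contained)
    W, *_ = np.linalg.lstsq(Xr, Yr, rcond=None)
    return Px, Py, W

def idsr_predict(model, X, threshold=0.5):
    """Regress in the reduced space, then decode back to label space."""
    Px, Py, W = model
    scores = (X @ Px) @ W @ Py.T
    return (scores >= threshold).astype(int)
```

The design point illustrated here is that the two reductions are fitted independently of each other, which is what lets the approach scale to large feature and label spaces.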


International Conference on Industrial, Engineering and Other Applications of Applied Intelligent Systems | 2017

Supervised Feature Space Reduction for Multi-Label Nearest Neighbors

Wissam Siblini; Reda Alami; Frank Meyer; Pascale Kuntz

With the ability to address many real-world problems, multi-label classification has received considerable attention in recent years, and the instance-based ML-kNN classifier is today considered one of the most efficient. However, it is sensitive to noisy and redundant features, and its performance decreases as data dimensionality increases. Dimensionality reduction offers an alternative, but current methods optimize reduction objectives that ignore the impact on the ML-kNN classification. We here propose ML-ARP, a novel dimensionality reduction algorithm which, using a variable neighborhood search metaheuristic, learns a linear projection of the feature space that specifically optimizes the ML-kNN classification loss. Numerical comparisons confirm that ML-ARP outperforms both ML-kNN without data preprocessing and four standard multi-label dimensionality reduction algorithms.
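The search strategy described above can be illustrated with a toy variable neighborhood search over a projection matrix. The leave-one-out 1-NN Hamming loss below is a simplified stand-in for the ML-kNN loss actually optimized by ML-ARP:

```python
import numpy as np

def knn_hamming_loss(Xp, Y):
    """Leave-one-out 1-NN Hamming loss in the projected space
    (simplified stand-in for the ML-kNN loss optimized by ML-ARP)."""
    D = np.linalg.norm(Xp[:, None] - Xp[None, :], axis=2)
    np.fill_diagonal(D, np.inf)        # exclude each point from its own query
    nn = D.argmin(axis=1)
    return (Y != Y[nn]).mean()

def ml_arp_sketch(X, Y, k, iters=200, seed=0):
    """Variable neighborhood search over a linear projection P (d x k):
    shake P with growing noise, keep a move only when the loss drops."""
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((X.shape[1], k))
    best = knn_hamming_loss(X @ P, Y)
    radius = 0.1                       # current neighborhood size
    for _ in range(iters):
        cand = P + radius * rng.standard_normal(P.shape)
        loss = knn_hamming_loss(X @ cand, Y)
        if loss < best:
            P, best, radius = cand, loss, 0.1   # improvement: reset the shake
        else:
            radius = min(radius * 1.5, 2.0)     # no improvement: widen it
    return P, best
```

The widening-then-reset schedule is the variable-neighborhood ingredient: unsuccessful moves progressively enlarge the search radius, and any improvement restarts the search close to the new incumbent.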


Studies in Classification, Data Analysis, and Knowledge Organization | 2015

Selecting a Multi-Label Classification Method for an Interactive System

Noureddine-Yassine Nair-Benrekia; Pascale Kuntz; Frank Meyer

Interactive classification-based systems engage users to coach learning algorithms to take their individual preferences into account. However, most recent interactive systems limit the user to single-label classification, which may not be expressive enough for organization tasks such as film classification, where a multi-label scheme is required. The objective of this paper is to compare the behavior of 12 multi-label classification methods in an interactive framework where "good" predictions must be produced in a very short time from a very small set of multi-label training examples. Experiments highlight important performance differences on four complementary evaluation measures (Log-Loss, Ranking-Loss, Learning and Prediction Times). The best results are obtained by Multi-label k Nearest Neighbors (ML-kNN), ensemble of classifier chains (ECC), and ensemble of binary relevance (EBR).
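Two of the quality measures named above, Ranking-Loss and Log-Loss, can be computed directly from label scores. A minimal numpy sketch (the convention of counting ties as ranking errors is an assumption, not taken from the paper):

```python
import numpy as np

def ranking_loss(Y, scores):
    """Multi-label ranking loss: average fraction of (relevant, irrelevant)
    label pairs ordered incorrectly by the scores."""
    losses = []
    for y, s in zip(Y, scores):
        pos, neg = s[y == 1], s[y == 0]
        if len(pos) == 0 or len(neg) == 0:
            continue  # undefined for all-positive / all-negative rows
        bad = (pos[:, None] <= neg[None, :]).mean()  # ties count as errors
        losses.append(bad)
    return float(np.mean(losses))

def log_loss(Y, probs, eps=1e-12):
    """Label-wise binary cross-entropy averaged over all entries."""
    p = np.clip(probs, eps, 1 - eps)  # clip to avoid log(0)
    return float(-(Y * np.log(p) + (1 - Y) * np.log(1 - p)).mean())
```

Ranking-Loss only depends on the ordering of scores within each example, while Log-Loss penalizes miscalibrated probabilities, which is why the two measures are complementary.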


International Conference on Machine Learning | 2018

CRAFTML, an Efficient Clustering-based Random Forest for Extreme Multi-label Learning

Wissam Siblini; Frank Meyer; Pascale Kuntz


International Conference on Machine Learning | 2018

Clustering-based Random Forest of Predictive Trees for Extreme Multi-label Learning

Wissam Siblini; Frank Meyer; Pascale Kuntz


Annual Conference on Computers | 2017

VIPE: A new interactive classification framework for large sets of short texts - application to opinion mining.

Wissam Siblini; Frank Meyer; Pascale Kuntz


Extraction et Gestion des Connaissances (EGC 2017) | 2017

VIPE : un outil interactif de classification multilabel de messages courts. (VIPE: an interactive tool for multi-label classification of short messages.)

Frank Meyer; Sylvie Tricot; Pascale Kuntz; Wissam Siblini


Conférence AAFD & SFC 2016 | 2016

Vers un apprentissage multi-label rapide en grande dimension – Une étude préliminaire (Towards fast high-dimensional multi-label learning – a preliminary study)

Wissam Siblini; Pascale Kuntz; Frank Meyer


Extraction et Gestion des Connaissances | 2014

Sélection d'une méthode de classification multi-label pour un système interactif (Selecting a multi-label classification method for an interactive system)

Noureddine-Yassine Nair-Benrekia; Pascale Kuntz; Frank Meyer
