Afif Masmoudi
University of Sfax
Publications
Featured research published by Afif Masmoudi.
Expert Systems With Applications | 2010
Lobna Bouchaala; Afif Masmoudi; Faiez Gargouri; Ahmed Rebai
Learning Bayesian network structure from a database is an NP-hard problem and still one of the most exciting challenges in machine learning. Most of the widely used heuristics search for (locally) optimal graphs by defining a score metric and employing a search strategy to identify the network structure with the maximum score. In this work, we propose a new score (named the implicit score) based on the Implicit inference framework that we proposed earlier. We then implemented this score within the K2 and MWST algorithms for network structure learning. The performance of the new score metric was evaluated on a benchmark database (the ASIA network) and a biomedical database of breast cancer, in comparison with the traditional score metrics BIC, BD and Mutual Information. We show that the implicit score yields improved performance over the other scores when used with the MWST algorithm and similar performance when implemented within the K2 algorithm.
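Since the abstract describes the general score-and-search recipe rather than the implicit score formula itself, the following Python sketch illustrates that recipe with a standard BIC score and a K2-style greedy parent search over discrete data; all function names are illustrative and the implicit score is not reproduced here.

```python
# Hedged sketch of score-based structure learning in the spirit of K2:
# a decomposable score (here plain BIC, standing in for the paper's implicit
# score) drives a greedy parent search for each node.
import numpy as np
from itertools import product

def local_bic(data, child, parents, arities):
    """BIC contribution of one node for a candidate parent set.
    data: (n_samples, n_vars) integer-coded discrete values."""
    n = data.shape[0]
    r = arities[child]
    loglik = 0.0
    for combo in product(*[range(arities[p]) for p in parents]):
        mask = np.ones(n, dtype=bool)
        for p, v in zip(parents, combo):
            mask &= data[:, p] == v
        counts = np.bincount(data[mask, child], minlength=r)
        nij = counts.sum()
        if nij > 0:
            nz = counts[counts > 0]
            loglik += (nz * np.log(nz / nij)).sum()
    q = int(np.prod([arities[p] for p in parents])) if parents else 1
    return loglik - 0.5 * np.log(n) * (r - 1) * q

def k2_parents(data, child, candidates, arities, max_parents=2):
    """Greedy K2-style selection of a parent set for a single node."""
    parents, best = [], local_bic(data, child, [], arities)
    while len(parents) < max_parents:
        gains = {c: local_bic(data, child, parents + [c], arities)
                 for c in candidates if c not in parents}
        if not gains:
            break
        c_best = max(gains, key=gains.get)
        if gains[c_best] <= best:
            break
        parents.append(c_best)
        best = gains[c_best]
    return parents, best
```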
Journal of Theoretical Biology | 2008
Hanen Ben Hassen; Afif Masmoudi; Ahmed Rebai
We introduce here the concept of Implicit networks, which provide, like Bayesian networks, a graphical modelling framework that encodes the joint probability distribution for a set of random variables within a directed acyclic graph. We show that Implicit networks, when used in conjunction with appropriate statistical techniques, are very attractive for understanding and analyzing biological data. In particular, we consider here the use of Implicit networks for causal inference in biomolecular pathways. In such pathways, an Implicit network encodes dependencies among variables (proteins, genes), can be trained to learn causal relationships (regulation, interaction) between them, and can then be used to predict the biological response given the status of some key proteins or genes in the network. We show that Implicit networks offer efficient methodologies for learning from observations without prior knowledge and thus provide a good alternative to classical inference in Bayesian networks when priors are missing. We illustrate our approach with an application to simulated data for a simplified signal transduction pathway of the epidermal growth factor receptor (EGFR) protein.
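As a concrete illustration of the shared idea that a directed acyclic graph encodes a joint distribution, here is a minimal Python sketch of the factorization P(x1,...,xn) = prod_i P(xi | parents(xi)) on a toy two-node pathway; the conditional probability values are made up and the Implicit learning machinery from the paper is not shown.

```python
# Toy DAG factorization: a joint distribution over binary variables is the
# product of each node's conditional distribution given its parents.
parents = {"A": [], "B": ["A"]}          # A activates B (illustrative)
cpt = {
    "A": {(): 0.3},                      # P(A=1)
    "B": {(0,): 0.1, (1,): 0.8},         # P(B=1 | A)
}

def joint(assignment):
    """Probability of a full 0/1 assignment under the DAG factorization."""
    p = 1.0
    for node, pa in parents.items():
        key = tuple(assignment[q] for q in pa)
        p1 = cpt[node][key]
        p *= p1 if assignment[node] == 1 else 1.0 - p1
    return p

print(joint({"A": 1, "B": 1}))  # 0.3 * 0.8 = 0.24
```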
Communications in Statistics-theory and Methods | 2005
Abdelhamid Hassairi; Afif Masmoudi; Célestin C. Kokonendji
In this article, we introduce a notion of implicit distribution for a parameter in a dominated statistical model. It is a conditional distribution of the parameter given the data. We show that the implicit distribution exists for the Jørgensen parameter in an exponential dispersion model. A method of point estimation is provided for some illustrative examples.
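For context, one standard way to write an exponential dispersion model (in its additive form) is shown below; λ is the Jørgensen (index) parameter whose implicit distribution the article studies. The construction of that implicit distribution itself is not reproduced here.

```latex
% Additive exponential dispersion model ED^*(\theta,\lambda); \lambda > 0 is the
% Jorgensen (index) parameter, \kappa the cumulant function.
f(x;\theta,\lambda) \;=\; \exp\{\theta x - \lambda\,\kappa(\theta)\}\, c(x;\lambda)
```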
Systems, Man and Cybernetics | 2002
H. Azaza; Afif Masmoudi
This paper deals with the control of a variable speed drive made up of a doubly-fed motor (DFM) and a rotor converter including two current-regulated voltage inverters separated by a DC link: the rotor-side inverter and the line-side one. Both inverters are controlled by vector control strategies using electrical variables expressed in a reference frame aligned with the DFM stator flux. High transient and steady-state performance of the DFM has been achieved with the implementation of stator flux oriented control in the rotor-side inverter, while unity overall power factor operation has been achieved with the implementation of a (P,Q) vector control strategy in the line-side inverter.
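The coordinate change underlying stator-flux-oriented vector control can be sketched as follows in Python: three-phase currents are mapped into a (d, q) frame rotating with the stator-flux angle, so flux and torque are handled through separate current components. The transform is standard; the DFM model, controller gains and (P,Q) strategy from the paper are not reproduced.

```python
# Clarke + Park transform into a stator-flux-oriented (d, q) frame,
# a sketch of the coordinate change used by the control strategy above.
import numpy as np

def abc_to_dq(ia, ib, ic, theta_flux):
    """Amplitude-invariant Clarke + Park transform to the stator-flux frame."""
    # Clarke transform (abc -> alpha/beta)
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (1.0 / np.sqrt(3.0)) * (ib - ic)
    # Park rotation by the stator-flux angle (alpha/beta -> d/q)
    i_d = np.cos(theta_flux) * i_alpha + np.sin(theta_flux) * i_beta
    i_q = -np.sin(theta_flux) * i_alpha + np.cos(theta_flux) * i_beta
    return i_d, i_q

# Example: balanced currents in phase with the flux angle land on the d axis.
theta = 0.3
ia, ib, ic = np.cos(theta), np.cos(theta - 2*np.pi/3), np.cos(theta + 2*np.pi/3)
print(abc_to_dq(ia, ib, ic, theta))  # ~ (1.0, 0.0)
```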
Multimedia Tools and Applications | 2015
Atef Masmoudi; William Puech; Afif Masmoudi
In this paper, we propose a new approach to block-based lossless image compression using finite mixture models and adaptive arithmetic coding. Conventional arithmetic encoders encode and decode images sample-by-sample in raster scan order, and conventional arithmetic coding models, whether static or adaptive, provide a single probability distribution for all source symbols to be compressed or transmitted. In the proposed scheme, by contrast, an image is divided into non-overlapping blocks and each block is encoded separately using arithmetic coding. The proposed model provides a probability distribution for each block, modeled as a mixture of non-parametric distributions that exploits the high correlation between neighboring blocks. The Expectation-Maximization algorithm is used to find the maximum likelihood mixture parameters in order to maximize arithmetic coding compression efficiency. Comparative experiments show significant improvements over state-of-the-art lossless image compression standards and algorithms. In particular, the proposed compression algorithm beats JPEG-LS by 9.7% when switching between pixel and prediction-error domains.
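A hedged sketch of the modelling step described above: block histograms are fitted with a finite mixture of discrete component distributions via EM, and the ideal arithmetic-coding cost of a block is its negative log-likelihood under the resulting per-block distribution. The actual arithmetic coder, block scanning order and neighbour-based initialisation from the paper are not shown, and all names are illustrative.

```python
# EM fit of a finite mixture of discrete distributions over block histograms,
# plus the ideal (entropy) coding cost of one block under its mixture model.
import numpy as np

def em_mixture(block_counts, n_components=3, n_iter=50, seed=0):
    """block_counts: (n_blocks, n_symbols) histogram of each block."""
    rng = np.random.default_rng(seed)
    n_blocks, n_symbols = block_counts.shape
    pi = np.full(n_components, 1.0 / n_components)        # mixture weights
    comp = rng.dirichlet(np.ones(n_symbols), size=n_components)  # p_k(symbol)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each block (in logs)
        log_r = np.log(pi)[None, :] + block_counts @ np.log(comp).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights and component tables from weighted counts
        pi = r.mean(axis=0)
        comp = (r.T @ block_counts) + 1e-9
        comp /= comp.sum(axis=1, keepdims=True)
    return pi, comp, r

def block_code_length_bits(counts, r_block, comp):
    """Ideal arithmetic-coding cost of one block under its mixture model."""
    p = r_block @ comp                      # per-block symbol distribution
    return float(-(counts * np.log2(p)).sum())
```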
Journal of Computational Biology | 2009
Hanen Ben Hassen; Afif Masmoudi; Ahmed Rebai
We summarize here the Implicit statistical inference approach as an alternative to Bayesian networks, and we give an effective iterative algorithm, analogous to the Expectation-Maximization algorithm, to infer a signal transduction network when the data set is incomplete. We prove the convergence of our algorithm, which we call the Implicit algorithm, and apply it to simulated data for a simplified signal transduction pathway of the EGFR protein.
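Since the abstract gives only the overall shape of the procedure, the following generic Python skeleton shows the kind of EM-style loop it refers to (complete the missing data, re-estimate parameters, stop when the change is small); the paper's actual Implicit update rules and convergence proof are not reproduced.

```python
# Generic EM-style iteration skeleton for incomplete data; the completion and
# estimation steps are user-supplied callables, not the paper's Implicit rules.
import numpy as np

def em_like(data_with_missing, complete_step, estimate_step, theta0,
            tol=1e-6, max_iter=200):
    """complete_step(data, theta) -> completed data;
    estimate_step(completed) -> new parameter vector."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        completed = complete_step(data_with_missing, theta)  # "E-like" step
        theta_new = np.asarray(estimate_step(completed))     # "M-like" step
        if np.max(np.abs(theta_new - theta)) < tol:
            return theta_new
        theta = theta_new
    return theta
```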
Journal of Computational and Applied Mathematics | 2015
Heni Bouhamed; Afif Masmoudi; Thierry Lecroq; Ahmed Rebai
Currently, Bayesian Networks (BNs) have become one of the most complete, self-sustained and coherent formalisms used for knowledge acquisition, representation and application through computer systems. However, learning BN structure from data has been shown to be an NP-hard problem and has turned out to be one of the most exciting challenges in machine learning. In this context, the present work's major objective lies in setting up a further solution conceived as a remedy for the intricate algorithmic complexity imposed when learning BN structure from massive datasets.
International Conference on Intelligent Computing | 2012
Heni Bouhamed; Afif Masmoudi; Thierry Lecroq; Ahmed Rebai
It is a well-known fact that the use of Bayesian Networks (BNs) as classifiers in different fields of application has recently witnessed a noticeable growth. Yet, the application of Naive Bayes, and even augmented Naive Bayes, to classifier-structure learning has been vulnerable to certain limits, which explains practitioners' resort to other, more sophisticated types of algorithms. Consequently, the use of such algorithms has raised the problem of a super-exponential increase in the computational complexity of Bayesian classifier structure learning as the number of descriptive variables grows. In this context, the present work's major objective lies in setting up a further solution whereby the intricate algorithmic complexity imposed when learning the structure of a Bayesian classifier with sophisticated algorithms can be remedied. The paper is organized as follows. We first propose a novel approach designed to reduce the algorithmic complexity without engendering any loss of information when learning the structure of a Bayesian classifier. We then test our approach on car diagnosis and lymphography diagnosis databases. Finally, we close with a discussion of the interest of the conducted work.
Neurocomputing | 2014
Aida Jarraya; Philippe Leray; Afif Masmoudi
Our work aims at developing or making explicit bridges between Bayesian networks (BNs) and Natural Exponential Families, by proposing discrete exponential Bayesian networks as a generalization of usual discrete ones. We introduce a family of prior distributions which generalizes the Dirichlet prior applied to discrete Bayesian networks, and then we determine the overall posterior distribution. Subsequently, we develop the Bayesian estimators of the parameters and a new score function that extends the Bayesian Dirichlet score for BN structure learning. Our goal is to determine empirically in which contexts some of our discrete exponential BNs (Poisson deBNs) can be an effective alternative to usual BNs for density estimation.
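As an illustration of the kind of conjugate Bayesian estimation the abstract refers to for the Poisson case, the sketch below updates one Gamma prior per parent configuration of a Poisson-parameterised node; the paper's exact prior family and extended score are not reproduced, and the function name is illustrative.

```python
# Conjugate Gamma-Poisson updating for one node of a Poisson-parameterised BN:
# a Gamma(a, b) prior per parent configuration, posterior Gamma(a + sum x, b + n),
# posterior-mean estimate of the node's rate per configuration.
import numpy as np

def poisson_node_posterior_means(child_values, parent_config_ids, a=1.0, b=1.0):
    """child_values: observed counts of the child variable, one per sample.
    parent_config_ids: integer id of the joint parent configuration per sample."""
    child_values = np.asarray(child_values)
    parent_config_ids = np.asarray(parent_config_ids)
    means = {}
    for j in np.unique(parent_config_ids):
        x = child_values[parent_config_ids == j]
        means[int(j)] = (a + x.sum()) / (b + len(x))
    return means

# toy usage: two parent configurations with different child rates
print(poisson_node_posterior_means([0, 1, 2, 5, 6, 7], [0, 0, 0, 1, 1, 1]))
```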
International Conference on Tools with Artificial Intelligence | 2011
Aida Jarraya; Philippe Leray; Afif Masmoudi
In this paper, we develop the notion of discrete exponential Bayesian networks, a parametrization of Bayesian networks (BNs) using more general discrete quadratic exponential families instead of the usual multinomial ones. We then introduce a family of prior distributions which generalizes the Dirichlet prior classically used with discrete Bayesian networks. We develop the posterior distribution for our discrete exponential BNs, leading to Bayesian estimation of the parameters of our models and a new scoring function extending the Bayesian Dirichlet score used for structure learning. These theoretical results are finally illustrated for Poisson and Negative Binomial BNs.
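For reference, the classical Bayesian Dirichlet marginal likelihood that the proposed scoring function extends has the standard form below, where N_ijk counts node i in state k under parent configuration j and the α_ijk are the Dirichlet hyperparameters.

```latex
% Classical Bayesian Dirichlet (BD) score, with N_{ij}=\sum_k N_{ijk} and
% \alpha_{ij}=\sum_k \alpha_{ijk}; n nodes, q_i parent configurations, r_i states.
P(D \mid G) \;=\; \prod_{i=1}^{n}\prod_{j=1}^{q_i}
\frac{\Gamma(\alpha_{ij})}{\Gamma(\alpha_{ij}+N_{ij})}
\prod_{k=1}^{r_i}\frac{\Gamma(\alpha_{ijk}+N_{ijk})}{\Gamma(\alpha_{ijk})}
```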