Laura Diosan
Institut national des sciences appliquées
Publications
Featured research published by Laura Diosan.
Computational Intelligence for Modelling, Control and Automation | 2005
Laura Diosan
The portfolio optimization problem is a well-known difficult problem arising in real-world finance. It consists of choosing an optimal set of assets so as to minimize the risk and maximize the profit of the investment. A multiobjective approach to this problem is suggested in this paper. We use three well-known evolutionary algorithms (namely NSGA2, PESA and SPEA2) for solving the bi-objective portfolio optimization problem. Several numerical experiments are performed using real-world data. The results show that PESA outperforms NSGA2 and SPEA2 for the considered test cases.
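As a hedged illustration of the two objectives involved, the Python sketch below evaluates a candidate portfolio by its variance (risk) and expected return (profit) and adds a Pareto-dominance test, the building block shared by NSGA2, PESA and SPEA2. The synthetic asset data, the evaluate/dominates names and the numpy usage are assumptions of this sketch, not material from the paper.

    # Illustrative only: bi-objective evaluation of a portfolio plus a Pareto-
    # dominance test; the asset data below are synthetic.
    import numpy as np

    def evaluate(weights, mean_returns, cov_matrix):
        """Return (risk, -profit); both objectives are minimized."""
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                          # normalize to a valid allocation
        profit = float(w @ mean_returns)         # expected portfolio return
        risk = float(w @ cov_matrix @ w)         # portfolio variance
        return risk, -profit

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (minimization)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    rng = np.random.default_rng(0)               # toy data: 4 assets
    mean_returns = rng.uniform(0.01, 0.10, 4)
    A = rng.normal(size=(4, 4))
    cov_matrix = A @ A.T * 1e-3                  # symmetric positive semi-definite

    p1 = evaluate([0.25, 0.25, 0.25, 0.25], mean_returns, cov_matrix)
    p2 = evaluate([0.70, 0.10, 0.10, 0.10], mean_returns, cov_matrix)
    print(p1, p2, dominates(p1, p2))

A multiobjective EA keeps the set of mutually non-dominated portfolios (the approximated Pareto front) rather than a single best allocation.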
Genetic Programming and Evolvable Machines | 2009
Laura Diosan; Mihai Oltean
Manual design of Evolutionary Algorithms (EAs) capable of performing very well on a wide range of problems is a difficult task. This is why we have to find other ways to construct algorithms that perform very well on some problems. One possibility (which is explored in this paper) is to let evolution discover the optimal structure and parameters of the EA used for solving a specific problem. To this end, a new model for the automatic generation of EAs by evolutionary means is proposed here. The model is based on a simple Genetic Algorithm (GA). Every GA chromosome encodes an EA, which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization are generated using the considered model. Numerical experiments show that the generated EAs perform similarly to, and sometimes even better than, standard approaches for several well-known benchmark problems.
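One way to picture the encoding is sketched below under assumed conventions (the operation codes, population sizes and the run_inner_ea helper are hypothetical, not the authors' representation): a meta-GA chromosome is a list of operation codes, and decoding it yields an inner EA whose generation loop simply replays those operations on a population of real-valued vectors.

    # Minimal sketch (not the authors' encoding): a chromosome is a list of
    # operation codes; decoding it yields an inner EA that replays them.
    import random

    OPS = ["select", "crossover", "mutate"]

    def random_chromosome(length=8):
        return [random.choice(OPS) for _ in range(length)]

    def run_inner_ea(chromosome, fitness, dim=5, pop_size=20, generations=50):
        pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
        for _ in range(generations):
            for op in chromosome:                # the evolved EA structure
                i, j = random.randrange(pop_size), random.randrange(pop_size)
                if op == "select":               # keep the better of two individuals
                    if fitness(pop[i]) < fitness(pop[j]):
                        pop[j] = list(pop[i])
                elif op == "crossover":          # convex (arithmetic) crossover
                    a = random.random()
                    pop[i] = [a * x + (1 - a) * y for x, y in zip(pop[i], pop[j])]
                else:                            # Gaussian mutation of one gene
                    k = random.randrange(dim)
                    pop[i][k] += random.gauss(0, 0.3)
        return min(fitness(ind) for ind in pop)

    sphere = lambda x: sum(v * v for v in x)     # a standard benchmark function
    print(run_inner_ea(random_chromosome(), sphere))

The outer GA would then evolve such chromosomes, using the value returned by the inner run as their fitness.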
International Conference on Machine Learning and Applications | 2007
Laura Diosan; Alexandrina Rogozan; Jean Pierre Pecuchet
A hybrid model for evolving Support Vector Machine (SVM) kernel functions is developed in this paper. The kernel expression is considered a parameter of the SVM algorithm, and the proposed approach tries to find the best expression for this parameter. The model is a hybrid technique that combines a Genetic Programming (GP) algorithm and an SVM classifier. Each GP chromosome is a tree encoding the mathematical expression of the kernel function. The evolved kernel is compared to several human-designed kernels and to a previously proposed genetic kernel on several datasets. Numerical experiments show that the SVM embedding our evolved kernel performs statistically better than both the standard kernels and the previous genetic kernel for all of the considered classification problems.
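For illustration only, the sketch below shows how a kernel given as a mathematical expression could be plugged into an SVM through a precomputed Gram matrix; scikit-learn, the toy dataset and the particular expression standing in for a GP tree are assumptions of this sketch, not the paper's experimental setup.

    # Illustration only: an expression-defined kernel plugged into an SVM via a
    # precomputed Gram matrix (scikit-learn); dataset and expression are made up.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def evolved_kernel(x, y, gamma=0.5):
        # stand-in for a GP tree such as (x.y + 1)^2 + exp(-gamma*||x - y||^2);
        # a sum of valid kernels is itself a valid kernel
        return (np.dot(x, y) + 1.0) ** 2 + np.exp(-gamma * np.sum((x - y) ** 2))

    def gram(A, B):
        return np.array([[evolved_kernel(a, b) for b in B] for a in A])

    X, y = make_classification(n_samples=200, n_features=6, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = SVC(kernel="precomputed").fit(gram(X_tr, X_tr), y_tr)
    print("accuracy:", clf.score(gram(X_te, X_tr), y_te))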
European Conference on Evolutionary Computation in Combinatorial Optimization | 2006
Laura Diosan; Mihai Oltean
A new model for evolving the structure of a Particle Swarm Optimization (PSO) algorithm is proposed in this paper. The model is a hybrid technique that combines a Genetic Algorithm (GA) and a PSO algorithm. Each GA chromosome is an array encoding a rule for updating the particles of the PSO algorithm. The evolved PSO algorithm is compared to a human-designed PSO algorithm on ten artificially constructed functions and one real-world problem. Numerical experiments show that the evolved PSO algorithm performs similarly to, and sometimes even better than, standard approaches for the considered problems.
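A rough sketch of the kind of structure an outer GA could evolve is given below: the per-iteration behaviour of the PSO is driven by an "update schedule" array of particle indices, so indices listed more often are updated more often. The schedule representation, constants and helper names are assumptions of this sketch, not the paper's encoding.

    # Rough sketch, not the paper's system: a PSO whose per-iteration behaviour
    # is driven by an evolved "update schedule" array of particle indices.
    import random

    def pso_with_schedule(schedule, fitness, dim=2, n_particles=10, iters=100):
        pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [list(p) for p in pos]
        gbest = list(min(pos, key=fitness))
        w, c1, c2 = 0.7, 1.5, 1.5
        for _ in range(iters):
            for i in schedule:                   # evolved update order / frequency
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if fitness(pos[i]) < fitness(pbest[i]):
                    pbest[i] = list(pos[i])
                    if fitness(pbest[i]) < fitness(gbest):
                        gbest = list(pbest[i])
        return fitness(gbest)

    sphere = lambda x: sum(v * v for v in x)
    schedule = [random.randrange(10) for _ in range(15)]   # candidate evolved by the GA
    print(pso_with_schedule(schedule, sphere))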
Applied Soft Computing | 2009
Mihai Oltean; Laura Diosan
The aim of this research is to develop an autonomous system for solving data analysis problems. The system, called Genetic Programming-Autonomous Solver (GP-AS), contains most of the features required of autonomous software: it decides whether or not it knows how to solve a particular problem, it can construct solutions for new problems, it can store the created solutions for later use, it can improve the existing solutions during idle time, it can efficiently manage computer resources for fast execution, and it can detect and handle failure cases. The generator of solutions for new problems is based on an adaptive variant of Genetic Programming. We have tested this part by solving some well-known problems in the fields of symbolic regression and classification. Numerical experiments show that the GP-AS system performs very well on the considered test problems and can successfully compete with standard GP whose parameters are set manually.
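The decision loop described above can be pictured with the hypothetical sketch below; SolutionStore, solve and evolve_solution are invented names for this illustration, not GP-AS APIs: check whether a solution for the problem is already known, otherwise construct one and store it for later reuse.

    # Hypothetical sketch of the decision loop; SolutionStore, solve and
    # evolve_solution are invented names, not GP-AS APIs.
    class SolutionStore:
        def __init__(self):
            self._solutions = {}                 # problem signature -> stored solution

        def lookup(self, signature):
            return self._solutions.get(signature)

        def save(self, signature, solution):
            self._solutions[signature] = solution

    def solve(signature, dataset, store, evolve_solution):
        known = store.lookup(signature)          # "do I already know how to solve this?"
        if known is not None:
            return known
        solution = evolve_solution(dataset)      # construct a new solution (e.g. by GP)
        store.save(signature, solution)          # keep it for later use and refinement
        return solution

    store = SolutionStore()
    print(solve("toy-regression", [(0, 0), (1, 1)], store, lambda data: "f(x) = x"))
    print(solve("toy-regression", None, store, lambda data: "never evolved again"))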
Genetic and Evolutionary Computation Conference | 2007
Laura Diosan; Mihai Oltean
A new model for the automatic generation of Evolutionary Algorithms (EAs) by evolutionary means is proposed in this paper. The model is based on a simple Genetic Algorithm (GA). Every GA chromosome encodes an EA, which is used for solving a particular problem. Several Evolutionary Algorithms for function optimization are evolved using the considered model. Numerical experiments show that the evolved Evolutionary Algorithms perform similarly to, and sometimes even better than, standard approaches for several well-known benchmark problems.
European Conference on Genetic Programming | 2006
Laura Diosan; Mihai Oltean
A new model for evolving crossover operators for evolutionary function optimization is proposed in this paper. The model is a hybrid technique that combines a Genetic Programming (GP) algorithm and a Genetic Algorithm (GA). Each GP chromosome is a tree encoding a crossover operator used for function optimization. The evolved crossover is embedded into a standard Genetic Algorithm, which is used for solving a particular problem. Several crossover operators for function optimization are evolved using the considered model, and the evolved operators are compared to the human-designed convex crossover. Numerical experiments show that the evolved crossover operators perform similarly to, and sometimes even better than, standard approaches for several well-known benchmark problems.
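As a hedged sketch (the operators below are illustrative, not the evolved ones from the paper), gene-wise convex crossover can be contrasted with an "evolved" crossover given as an arbitrary expression over the two parent genes, which is how a GP tree acting on a pair of parents can be read.

    # Illustrative operators, not the evolved ones from the paper.
    import random

    def convex_crossover(p1, p2):
        a = random.random()
        return [a * x + (1 - a) * y for x, y in zip(p1, p2)]

    def evolved_crossover(p1, p2, expr):
        # expr is a function of two parent genes, e.g. decoded from a GP tree
        return [expr(x, y) for x, y in zip(p1, p2)]

    parents = ([1.0, 2.0, 3.0], [4.0, 1.0, 0.5])
    print(convex_crossover(*parents))
    print(evolved_crossover(*parents, expr=lambda x, y: 0.5 * (x + y) + 0.1 * (x - y)))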
Genetic and Evolutionary Computation Conference | 2007
Laura Diosan; Mihai Oltean; Alexandrina Rogozan; Jean Pierre Pecuchet
Classical kernel-based classifiers use only a single kernel, but real-world applications have emphasized the need to consider a combination of kernels, also known as a multiple kernel, in order to boost performance. Our purpose is to automatically find the mathematical expression of a multiple kernel by evolutionary means. In order to achieve this purpose, we propose a hybrid model that combines a Genetic Programming (GP) algorithm and a kernel-based Support Vector Machine (SVM) classifier. Each GP chromosome is a tree encoding the mathematical expression of a multiple kernel. Numerical experiments show that the SVM embedding the evolved multiple kernel performs better than the standard kernels for the considered classification problems.
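A hedged sketch of what a multiple kernel can look like is given below: a weighted sum of an RBF and a polynomial Gram matrix, fitted through scikit-learn's precomputed-kernel SVM. The weights, base kernels and combination operator are placeholders for what the evolutionary search would decide; none of the specifics below come from the paper.

    # Hedged sketch: a "multiple kernel" as a weighted sum of base Gram matrices;
    # weights, base kernels and operators stand in for what evolution would find.
    from sklearn.datasets import make_classification
    from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    def multiple_kernel(A, B, w=(0.6, 0.4), gamma=0.2, degree=2):
        return (w[0] * rbf_kernel(A, B, gamma=gamma)
                + w[1] * polynomial_kernel(A, B, degree=degree))

    X, y = make_classification(n_samples=200, n_features=8, random_state=1)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

    clf = SVC(kernel="precomputed").fit(multiple_kernel(X_tr, X_tr), y_tr)
    print("accuracy:", clf.score(multiple_kernel(X_te, X_tr), y_te))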
Genetic and Evolutionary Computation Conference | 2007
Oana Muntean; Laura Diosan; Mihai Oltean
The result of the program encoded into a Genetic Programming (GP) tree is usually returned by the root of that tree. However, this is not the only possible strategy. In this paper we present and investigate a new variant in which the best subtree is chosen to provide the solution of the problem, and the other nodes (those not belonging to the best subtree) are deleted. This reduces the size of the chromosome whenever its best subtree differs from the entire tree. We have tested this strategy on a wide range of regression and classification problems. Numerical experiments show that the proposed approach can improve both the search speed and the quality of the results.
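A compact sketch of the idea follows, under an assumed tuple-based tree representation that is not the paper's code: enumerate every subtree of a GP expression tree, score each one on the training data, and keep only the best-scoring subtree as the program's output.

    # Assumed tuple-based tree representation, not the paper's code.
    # Nodes: ("+", left, right), ("*", left, right), ("x",) or ("const", value).
    def evaluate(node, x):
        kind = node[0]
        if kind == "x":
            return x
        if kind == "const":
            return node[1]
        a, b = evaluate(node[1], x), evaluate(node[2], x)
        return a + b if kind == "+" else a * b

    def subtrees(node):
        yield node
        if node[0] in ("+", "*"):
            yield from subtrees(node[1])
            yield from subtrees(node[2])

    def best_subtree(tree, data):
        def error(node):
            return sum((evaluate(node, x) - y) ** 2 for x, y in data)
        return min(subtrees(tree), key=error)

    # target is y = x*x; the full tree carries a useless "+ 3" that pruning discards
    tree = ("+", ("*", ("x",), ("x",)), ("const", 3.0))
    data = [(x, x * x) for x in range(-5, 6)]
    print(best_subtree(tree, data))              # -> ('*', ('x',), ('x',))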
Journal of Artificial Evolution and Applications | 2008
Laura Diosan; Mihai Oltean
Evolutionary algorithms (EAs) can be used to design Particle Swarm Optimization (PSO) algorithms that work, in some cases, considerably better than human-designed ones. By analyzing the evolutionary process of designing PSO algorithms, we can identify different swarm phenomena (such as patterns or rules) that give us deep insights into swarm behavior. The rules that have been observed can help us design better PSO algorithms for optimization. We investigate and analyze swarm phenomena by looking into the process of evolving PSO algorithms. Several test problems have been analyzed in the experiments, and interesting facts can be inferred from the strategy evolution process: particle quality can influence the update order, some particles are updated more frequently than others, and the initial swarm size is not always optimal.
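As a minimal illustration of the first observed rule, the fixed schedule in the earlier PSO sketch could be replaced by an ordering recomputed from the current particle fitness at each iteration; the helper below is hypothetical, not taken from the paper.

    # Hypothetical helper: recompute the particle update order from current fitness.
    def quality_ordered_schedule(positions, fitness, best_first=True):
        order = sorted(range(len(positions)), key=lambda i: fitness(positions[i]))
        return order if best_first else order[::-1]

    positions = [[1.0, 2.0], [0.1, 0.1], [3.0, -1.0]]
    print(quality_ordered_schedule(positions, lambda p: sum(v * v for v in p)))  # -> [1, 0, 2]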