Pascal Ballet
École nationale d'ingénieurs de Brest
Publications
Featured research published by Pascal Ballet.
Pattern Recognition | 2004
Vincent Rodin; Abdesslam Benzinou; Anne Guillaud; Pascal Ballet; Fabrice Harrouet; Jacques Tisseau; J. Le Bihan
In this article, we present a parallel image-processing system based on the concept of reactive agents. Our system relies on the oRis language, which makes it possible to describe the agents' behaviors finely and simply in order to detect image features. We also present a segmentation method using a multi-agent system, and two biological applications built with oRis. The stopping of this multi-agent system is implemented through a technique borrowed from immunology: apoptosis.
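The abstract does not reproduce any oRis code, but the apoptosis-based stopping rule is easy to illustrate. The sketch below is a minimal Python analogue under assumed behaviors: agents wander a gradient image, mark strong edges, and die after a fixed number of fruitless steps, so the system halts globally once all agents have "apoptosed". All names (EdgeAgent, ttl, thresholds) are hypothetical, not the authors' API.

```python
import numpy as np

class EdgeAgent:
    """Reactive agent that wanders an image, marks strong gradients,
    and 'apoptoses' (dies) after ttl steps without finding a feature."""
    def __init__(self, pos, ttl=20):
        self.pos = pos
        self.ttl = ttl                  # remaining lifetime; reset on success

    def step(self, grad, mask, rng, ttl_reset=20, threshold=0.3):
        y, x = self.pos
        if grad[y, x] > threshold:      # feature found: mark pixel and survive
            mask[y, x] = 1
            self.ttl = ttl_reset
        else:
            self.ttl -= 1               # starvation eventually triggers apoptosis
        dy, dx = rng.integers(-1, 2, size=2)   # random local move
        self.pos = (int(np.clip(y + dy, 0, grad.shape[0] - 1)),
                    int(np.clip(x + dx, 0, grad.shape[1] - 1)))
        return self.ttl > 0

def segment(image, n_agents=500, steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gy, gx)
    grad /= grad.max() + 1e-9
    mask = np.zeros_like(grad, dtype=np.uint8)
    agents = [EdgeAgent((int(rng.integers(grad.shape[0])),
                         int(rng.integers(grad.shape[1]))))
              for _ in range(n_agents)]
    for _ in range(steps):
        agents = [a for a in agents if a.step(grad, mask, rng)]
        if not agents:                  # the global stop emerges from local deaths
            break
    return mask
```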
International Journal of Astrobiology | 2012
Nicolas Glade; Pascal Ballet; Olivier Bastien
The number N of detectable (i.e. communicating) extraterrestrial civilizations in the Milky Way galaxy is usually calculated using the Drake equation. This equation was established in 1961 by Frank Drake and was the first step in quantifying the Search for ExtraTerrestrial Intelligence (SETI) field. In practice, the equation is a rather simple algebraic expression, and its simplistic nature leaves it open to frequent re-expression. An additional problem of the Drake equation is the time-independence of its terms, which, for example, excludes the effects of the physico-chemical history of the galaxy. Recently, it has been shown that the main shortcoming of the Drake equation is its lack of temporal structure, i.e. it fails to take into account various evolutionary processes. In particular, the Drake equation does not provide any error estimate for the measured quantity. Here, we propose a first treatment of these evolutionary aspects by constructing a simple stochastic process able to provide both a temporal structure for the Drake equation (i.e. introduce time into the Drake formula so as to obtain something like N(t)) and a first standard error measure.
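A natural minimal stochastic reading of this idea, sketched below under assumed parameter values, treats detectable civilizations as a Poisson birth process with random lifetimes, so that N(t) becomes a random variable whose mean and standard error can be estimated by Monte Carlo. This is only an illustration of the kind of model the paper discusses, not the authors' exact construction; the rate and lifetime numbers are toy values.

```python
import numpy as np

def simulate_N(t, rate, mean_lifetime, runs=10_000, seed=0):
    """Monte Carlo estimate of N(t): civilizations appear as a Poisson
    process with intensity `rate` (per year) and persist for an
    exponentially distributed lifetime with mean `mean_lifetime`."""
    rng = np.random.default_rng(seed)
    counts = np.empty(runs)
    for i in range(runs):
        n_births = rng.poisson(rate * t)
        births = rng.uniform(0.0, t, size=n_births)            # birth dates
        lifetimes = rng.exponential(mean_lifetime, size=n_births)
        counts[i] = np.count_nonzero(births + lifetimes > t)   # still alive at t
    return counts.mean(), counts.std(ddof=1) / np.sqrt(runs)

# Toy numbers: one communicating civilization born every 100 years on
# average, each lasting 10,000 years; equilibrium mean is rate * L = 100.
mean, stderr = simulate_N(t=1e5, rate=0.01, mean_lifetime=1e4)
print(f"N(t) = {mean:.1f} +/- {stderr:.2f}")
```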
Acta Biotheoretica | 2002
Patrick Amar; Pascal Ballet; Georgia Barlovatz-Meimon; Arndt Benecke; Gilles Bernot; Yves Bouligand; Paul Bourguine; Franck Delaplace; Jean-Marc Delosme; Maurice Demarty; Itzhak Fishov; Jean Fourmentin-Guilbert; Joe A. Fralick; Jean-Louis Giavitto; Bernard Gleyse; Christophe Godin; Roberto Incitti; François Képès; Catherine Lange; Loïs Le Sceller; Corinne Loutellier; Olivier Michel; Franck Molina; Chantal Monnier; René Natowicz; Vic Norris; Nicole Orange; Hélène Pollard; Derek Raine; Camille Ripoll
New concepts may prove necessary to profit from the avalanche of sequence data on the genome, transcriptome, proteome and interactome and to relate this information to cell physiology. Here, we focus on the concept of large activity-based structures, or hyperstructures, in which a variety of types of molecules are brought together to perform a function. We review the evidence for the existence of hyperstructures responsible for the initiation of DNA replication, the sequestration of newly replicated origins of replication, cell division and for metabolism. The processes responsible for hyperstructure formation include changes in enzyme affinities due to metabolite-induction, lipid-protein affinities, elevated local concentrations of proteins and their binding sites on DNA and RNA, and transertion. Experimental techniques exist that can be used to study hyperstructures and we review some of the ones less familiar to biologists. Finally, we speculate on how a variety of in silico approaches involving cellular automata and multi-agent systems could be combined to develop new concepts in the form of an Integrated cell (I-cell) which would undergo selection for growth and survival in a world of artificial microbiology.
Systems, Man and Cybernetics | 1997
Pascal Ballet; Jacques Tisseau; F. Harrouet
The mechanisms of the immune system are very complex and the number of parameters is extremely large. Moreover, the interactions between the different cells during an immune response induce chaotic and nonlinear phenomena. Our approach consists of using the cooperative models established by immunologists to build a multi-agent model. We avoid the problem of nondeterminism by encoding only the basic behaviors of the agents; the global chaotic phenomena are induced by the interactions between the agents. The advantage is that agents can be viewed, modified, removed from the model, or added to it very easily. That is not the case in mathematics, where modifying an assumption generally involves rewriting the model. This approach is possible because the behaviors of several immune-system cells are known in broad outline and some qualitative models of immune-cell cooperation have been developed by immunologists. These models have already been shown to be valid locally in time and space, i.e. they do not take a global approach. With the multi-agent system we can, through simulation, analyse the global consequences of the local behaviors and observe a striking qualitative resemblance to statistical results from real experiments. We have chosen to simulate a human secondary humoral response with a multi-agent system in order to study the kinetics of antibody proliferation with several types of antigenic substances.
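To make the "local rules, global kinetics" idea concrete, here is a deliberately crude Python sketch in the same spirit, not the authors' model: each B cell independently encounters antigen, activated cells proliferate and secrete antibody, and antibody clears antigen. All rate constants are invented for illustration; only the per-agent rules are encoded, and the response curve emerges from their interaction.

```python
import numpy as np

def humoral_response(steps=200, antigen0=500.0, b_cells0=20, seed=0):
    """Toy secondary humoral response driven only by local rules."""
    rng = np.random.default_rng(seed)
    antigen, b_cells, antibody = antigen0, b_cells0, 0.0
    history = []
    for _ in range(steps):
        # each B cell meets antigen with probability proportional to
        # antigen concentration (a basic, local behavior)
        p_meet = min(1.0, antigen / 1000.0)
        activated = rng.binomial(b_cells, p_meet)
        b_cells += activated                           # clonal proliferation
        antibody += 5.0 * activated                    # secretion
        cleared = min(antigen, 0.01 * antibody * antigen)
        antigen -= cleared                             # neutralization
        antibody *= 0.95                               # antibody decay
        history.append((antigen, b_cells, antibody))
    return history
```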
Bioinformatics and Bioengineering | 2009
Vincent Rodin; Gabriel Querrec; Pascal Ballet; François–Régis Bataille; Gireg Desmeulles; Jean François Abgrall; Jacques Tisseau
In order to simulate biological processes, we use multi-agent systems. However, modelling cell behavior in systems biology is complex and may be based on intracellular biochemical pathways. We have therefore developed in this study a Fuzzy Influence Graph to model the MAPK pathway. A Fuzzy Influence Graph is also called a Fuzzy Cognitive Map. This model can be integrated into agents representing cells. Results indicate that despite individual variations, the average behavior of the MAPK pathway in a group of cells is close to the results obtained with ordinary differential equations. We have also modelled multiple myeloma cell signalling using this approach.
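The Fuzzy Cognitive Map update rule itself is standard and compact: each concept's next activation is a squashed weighted sum of the influences pointing at it. The sketch below shows that rule on a hypothetical four-concept cascade loosely inspired by MAPK signalling; the weights and concept names are illustrative only, not the influence graph from the paper.

```python
import numpy as np

def fcm_step(state, W, lam=1.0):
    """One Fuzzy Cognitive Map update: logistic squashing of the
    weighted influences W[i, j] of concept j on concept i."""
    return 1.0 / (1.0 + np.exp(-lam * (W @ state)))

# Hypothetical concepts: 0 growth signal, 1 Raf, 2 MEK, 3 ERK.
W = np.array([
    [1.0, 0.0, 0.0, -0.4],   # signal sustains itself; ERK feeds back negatively
    [0.8, 0.0, 0.0,  0.0],   # signal activates Raf
    [0.0, 0.9, 0.0,  0.0],   # Raf activates MEK
    [0.0, 0.0, 0.9,  0.0],   # MEK activates ERK
])
state = np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(50):          # iterate to a steady activation pattern
    state = fcm_step(state, W)
print(dict(zip(["signal", "Raf", "MEK", "ERK"], state.round(3))))
```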
Theory in Biosciences | 2011
Vic Norris; Abdallah Zemirline; Patrick Amar; Jean Nicolas Audinot; Pascal Ballet; Eshel Ben-Jacob; Gilles Bernot; Guillaume Beslon; Armelle Cabin; Eric Fanchon; Jean-Louis Giavitto; Nicolas Glade; Patrick Greussay; Yohann Grondin; James A. Foster; Guillaume Hutzler; Jürgen Jost; François Képès; Olivier Michel; Franck Molina; Jacqueline Signorini; Pasquale Stano; Alain R. Thierry
The relevance of biological materials and processes to computing (alias bioputing) has been explored for decades. These materials include DNA, RNA and proteins, while the processes include transcription, translation, signal transduction and regulation. Recently, the use of bacteria themselves as living computers has been explored, but this use generally falls within the classical paradigm of computing. Computer scientists, however, have a variety of problems to which they seek solutions, while microbiologists are gaining new insights into the problems bacteria are solving and how they are solving them. Here, we envisage that bacteria might be used for new sorts of computing. These could be based on the capacity of bacteria to grow, move and adapt to a myriad of different, fickle environments, both as individuals and as populations of bacteria plus bacteriophage. New principles might be based on the way that bacteria explore phenotype space via hyperstructure dynamics and the fundamental nature of the cell cycle. This computing might even extend to developing a high-level language appropriate to using populations of bacteria and bacteriophage. Here, we offer a speculative tour of what we term bactoputing, namely the use of the natural behaviour of bacteria for calculating.
Systems, Man and Cybernetics | 2000
Pascal Ballet; Jean François Abgrall; Vincent Rodin; Jacques Tisseau
Describes a simulation of platelet agglutination in a damaged vein. This agglutination, called plasmatic coagulation, occurs in the human body, and its malfunction leads to serious diseases such as thrombosis or hemophilia. We designed an in machina experiment that would be very difficult to carry out in vitro. The first aim of this simulation is to verify one of the biological models of plasmatic coagulation and primary hemostasis. The second aim is to test different ways to regulate thrombin production. We then simulate a hemophilia-like disease and one of its possible treatments.
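The regulation question can be illustrated with a toy feedback loop, which is far simpler than the agent-based simulation described above: thrombin catalyses the conversion of its own precursor while an inhibitor clears it, and reducing the initial precursor level mimics a hemophilia-like clotting-factor deficit. All constants below are invented for illustration.

```python
def coagulation_toy(steps=500, dt=0.01, factor0=1.0, inhibition=0.5):
    """Toy thrombin feedback loop: autocatalytic production from a
    finite precursor pool, balanced by first-order inhibition.
    Lowering `factor0` mimics a hemophilia-like deficit."""
    precursor, thrombin = factor0, 0.01
    trace = []
    for _ in range(steps):
        production = 2.0 * thrombin * precursor     # autocatalysis
        precursor -= dt * production                # precursor consumption
        thrombin += dt * (production - inhibition * thrombin)
        trace.append(thrombin)
    return trace

burst = coagulation_toy()                 # normal clotting burst
deficit = coagulation_toy(factor0=0.2)    # weakened, hemophilia-like response
print(max(burst), max(deficit))
```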
Food and Chemical Toxicology | 2017
G. Chevillotte; A. Bernard; Clémence Varret; Pascal Ballet; Laurent Bodin; Alain-Claude Roudot
More and more studies aim to characterize non-monotonic dose-response curves (NMDRCs). The greatest difficulty is to assess the statistical plausibility of NMDRCs from previously conducted dose-response studies. This difficulty stems from the fact that these studies present (i) few tested doses, (ii) a low sample size per dose, and (iii) no raw data. In this study, we propose a new methodological approach to characterize NMDRCs probabilistically. The methodology consists of three main steps: (i) sampling from summary data, to cover all the possibilities that may be presented by the responses measured per dose and to obtain a new raw database; (ii) statistical analysis of each sampled dose-response curve to characterize the slopes and their signs; and (iii) characterization of these dose-response curves according to the variation of the sign of the slope. This method can characterize all types of dose-response curves and can be applied both to continuous and to discrete data. The aim of this study is to present the general principle of this probabilistic method for assessing non-monotonic dose-response curves, and to present some results.
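A bare-bones version of the three steps can be sketched as a bootstrap: resample plausible raw responses from per-dose summary statistics, compute slope signs between consecutive doses, and count sign changes. The code below is a simplified reading of the method under assumed Gaussian resampling, not the authors' implementation, and the U-shaped summary data are invented.

```python
import numpy as np

def p_non_monotonic(doses, means, sds, ns, n_boot=5000, seed=0):
    """(i) resample raw responses from per-dose summary stats,
    (ii) compute slope signs between consecutive doses,
    (iii) label a bootstrap curve non-monotonic if the sign changes."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_boot):
        boot_means = [rng.normal(m, s, n).mean()
                      for m, s, n in zip(means, sds, ns)]
        slopes = np.diff(boot_means) / np.diff(doses)
        signs = np.sign(slopes)
        signs = signs[signs != 0]
        if np.any(np.diff(signs) != 0):     # at least one sign reversal
            hits += 1
    return hits / n_boot

# Toy U-shaped summary data: 4 doses with mean, SD, and sample size.
p = p_non_monotonic(doses=np.array([0.1, 1.0, 10.0, 100.0]),
                    means=[1.0, 0.6, 0.7, 1.4],
                    sds=[0.2, 0.2, 0.2, 0.2],
                    ns=[5, 5, 5, 5])
print(f"P(non-monotonic) = {p:.2f}")
```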
TPNC 2013, 2nd International Conference on the Theory and Practice of Natural Computing | 2013
Anne Jeannin-Girardon; Pascal Ballet; Vincent Rodin
In the context of tissue-morphogenesis studies, in silico simulations can be seen as experiments on a virtual lab bench. Such simulations can facilitate the comprehension of a system, the testing of hypotheses, or the incremental refinement of a model and its parameters. In silico simulations must be efficient and offer the possibility of simulating large tissues containing thousands of cells. We propose to study tissue morphogenesis at the cellular level using our virtual biomechanical cell model. This model is based on a mass/spring system coupled to a multi-agent system. We validated the relevance of our model through a case study: cell sorting. Moreover, we took advantage of the massive parallelism offered by graphics processing units (GPUs), which contain up to thousands of cores, by implementing our model with the OpenCL framework. We ran large-scale simulations with up to 10^6 of our virtual cells. We studied the performance of our system on an Intel Core i7 860 CPU and two GPUs: an Nvidia GeForce GT 440 and an Nvidia GeForce GTX 690. The absence of synchronization in our implementation let us reap the full benefits of the parallelism of this hardware.
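The paper's implementation is in OpenCL; as a language-neutral illustration of the underlying mass/spring update, here is a vectorized NumPy sketch. Each spring is processed independently, which is exactly what maps well to one-work-item-per-element GPU kernels. Parameter names and constants are assumptions, not the authors' code.

```python
import numpy as np

def spring_step(pos, vel, edges, rest_len, k=10.0, damping=0.1, dt=0.01):
    """One explicit integration step of a mass/spring system (unit masses),
    vectorized over all springs."""
    i, j = edges[:, 0], edges[:, 1]
    d = pos[j] - pos[i]                                  # spring vectors
    length = np.linalg.norm(d, axis=1, keepdims=True)
    # Hooke's law along each spring direction:
    force = k * (length - rest_len[:, None]) * d / np.maximum(length, 1e-9)
    acc = np.zeros_like(pos)
    np.add.at(acc, i,  force)                            # equal and opposite
    np.add.at(acc, j, -force)
    vel = (vel + dt * acc) * (1.0 - damping)
    return pos + dt * vel, vel

# Two masses joined by one spring, stretched past its rest length:
pos = np.array([[0.0, 0.0], [2.0, 0.0]])
vel = np.zeros_like(pos)
edges = np.array([[0, 1]])
rest_len = np.array([1.0])
for _ in range(100):
    pos, vel = spring_step(pos, vel, edges, rest_len)
print(pos)   # the two masses relax toward the rest length
```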
Acta Biotheoretica | 2013
Anne Jeannin-Girardon; Pascal Ballet; Vincent Rodin
The primary aim of simulation in a virtual environment is to help biologists gain a better understanding of the simulated system. The cost of such a simulation is significantly lower than that of in vivo experimentation. However, the inherent complexity of biological systems makes them hard to simulate on non-parallel architectures: models might be made of sub-models and take several scales into account, and the number of simulated entities may be quite large. Today, graphics cards are used for general-purpose computing, which has been made easier thanks to frameworks like CUDA or OpenCL. Parallelizing models may, however, not be easy: parallel-programming skills are often required, and several hardware architectures may be used to execute models. In this paper, we present the software architecture we built in order to implement various models able to simulate multi-cellular systems. This architecture is modular and implements data structures adapted to graphics-processing-unit architectures. It allows efficient simulation of biological mechanisms.
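The abstract mentions GPU-adapted data structures without detailing them. A common choice in GPU cell simulators is a structure-of-arrays layout, where each cell attribute lives in its own flat, typed array so thread t reads attribute[t] with coalesced memory access. The sketch below shows the idea in NumPy with hypothetical field names; it is not necessarily the paper's actual schema.

```python
import numpy as np

# Structure-of-arrays layout: one flat, typed array per cell attribute.
n_cells = 100_000
positions = np.zeros((n_cells, 2), dtype=np.float32)  # x, y per cell
volumes = np.ones(n_cells, dtype=np.float32)
cell_type = np.zeros(n_cells, dtype=np.int32)

# A per-cell update then becomes a single vectorized (GPU-mappable) pass:
rng = np.random.default_rng(0)
positions += 0.01 * rng.standard_normal(positions.shape).astype(np.float32)
```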