Carlos Magno Couto Jacinto
Petrobras
Publications
Featured research published by Carlos Magno Couto Jacinto.
Expert Systems With Applications | 2013
Ronay Ak; Yan-Fu Li; Valeria Vitelli; Enrico Zio; Enrique López Droguett; Carlos Magno Couto Jacinto
Scale deposition can damage equipment in the oil & gas production industry. Hence, the reliable and accurate prediction of the scale deposition rate is critical for production availability. In this study, we consider the problem of predicting the scale deposition rate, providing an indication of the associated prediction uncertainty. We tackle the problem using an empirical modeling approach, based on experimental data. Specifically, we implement a multi-objective genetic algorithm (namely, non-dominated sorting genetic algorithm-II (NSGA-II)) to train a neural network (NN) (i.e., to find its parameters, that is, its weights and biases) to provide the prediction intervals (PIs) of the scale deposition rate. The PIs are optimized both in terms of accuracy (coverage probability) and dimension (width). We perform k-fold cross-validation to guide the choice of the NN structure (i.e., the number of hidden neurons). We use the hypervolume indicator metric to evaluate the Pareto fronts in the validation step. A case study is considered, with regard to a set of experimental observations: the NSGA-II-trained neural network is shown capable of providing PIs with both high coverage and small width.
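The abstract above describes the core mechanism: a multi-objective search over network weights trading PI coverage against PI width. Below is a minimal sketch of that idea, assuming the pymoo library for NSGA-II; the toy data, network size, and output parameterization are illustrative placeholders, not the paper's actual setup.

```python
# Minimal sketch: NSGA-II evolves NN weights so the two outputs form a PI.
import numpy as np
from pymoo.core.problem import Problem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 3))             # hypothetical inputs
y = X.sum(axis=1) + rng.normal(0, 0.2, size=100)  # hypothetical deposition rate

H = 5                          # hidden neurons (the paper picks this via k-fold CV)
n_var = 3 * H + H + 2 * H + 2  # weights/biases of a 3-H-2 network

def forward(w, X):
    """One hidden tanh layer; two outputs give the lower/upper PI bounds."""
    W1 = w[:3 * H].reshape(3, H)
    b1 = w[3 * H:4 * H]
    W2 = w[4 * H:6 * H].reshape(H, 2)
    b2 = w[-2:]
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    lo = out[:, 0]
    hi = out[:, 0] + np.abs(out[:, 1])  # enforce hi >= lo by construction
    return lo, hi

class PIProblem(Problem):
    def __init__(self):
        super().__init__(n_var=n_var, n_obj=2, xl=-5.0, xu=5.0)

    def _evaluate(self, pop, out, *args, **kwargs):
        F = np.zeros((pop.shape[0], 2))
        for i, w in enumerate(pop):
            lo, hi = forward(w, X)
            coverage = np.mean((y >= lo) & (y <= hi))
            width = np.mean(hi - lo)
            F[i] = [1.0 - coverage, width]  # maximize coverage, minimize width
        out["F"] = F

res = minimize(PIProblem(), NSGA2(pop_size=50), ("n_gen", 100), seed=1, verbose=False)
print(res.F[:5])  # Pareto front of (1 - coverage, mean width)
```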
Winter Simulation Conference | 2005
D.K. Coelho; Mauro Roisenberg; P.J.d.F. Filho; Carlos Magno Couto Jacinto
This paper shows how two different methodologies, a Monte Carlo simulation method and a connectionist approach, can be used to estimate the total time of drilling and completion operations for oil wells in deep waters. The former approach performs a Monte Carlo simulation based on data from field operations. In the latter, correlations and regularities in parameters selected from a petroleum company database are detected using a competitive neural network, and a feedforward neural network is then trained to estimate the average, standard deviation, and total time taken to complete the well. Finally, the results obtained by both models are compared, so the analyst can evaluate the precision of the total-time estimates based on the geometric and technological parameters provided by the neural network tool against those supplied by the traditional Monte Carlo method based on data from the drilling and completion operations.
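For the Monte Carlo side of the comparison above, a minimal sketch of total-time estimation from per-phase duration distributions might look as follows; the phases and triangular parameters are made-up stand-ins for distributions fitted to field data.

```python
# Minimal sketch: sum per-phase durations across many Monte Carlo runs.
import numpy as np

rng = np.random.default_rng(42)

# (min, mode, max) in days for each drilling/completion phase -- made-up values
phases = {
    "drilling_surface":   (2.0, 3.0, 6.0),
    "drilling_reservoir": (5.0, 8.0, 15.0),
    "completion":         (4.0, 6.0, 12.0),
}

n_runs = 10_000
total = np.zeros(n_runs)
for lo, mode, hi in phases.values():
    total += rng.triangular(lo, mode, hi, size=n_runs)  # phase durations add up

print(f"mean total time: {total.mean():.1f} d, "
      f"std: {total.std():.1f} d, "
      f"P90: {np.percentile(total, 90):.1f} d")
```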
Simulation Modelling Practice and Theory | 2008
Enrique López Droguett; Carlos Magno Couto Jacinto; Manoel Feliciano da Silva
Compelled by increasing oil prices, a research effort is underway to design and implement intelligent oil fields in Brazil, with a first pilot directed towards mature wells in the Northeast. One of the major benefits of this technology is the anticipation of oil production volumes and improved reservoir management and control. Given the considerable investment in the new technology, availability is a key attribute: higher availability means higher production volumes. An important part of this effort is the development of pressure–temperature optical monitoring systems (OMS) and the assessment of their availability. Availability analysis of an OMS imposes some complexities, the most relevant of which are: (i) the system undergoes a deterioration process; (ii) the time available to complete maintenance is limited; and (iii) the human error probability (HEP) during maintenance, which is influenced by the available time and other factors (e.g., experience, fatigue), affects the return of an OMS to its normal operational condition. In this paper we present a first attempt to solve this problem. We develop an availability assessment model in which the system dynamics are described via a continuous-time semi-Markovian process specified in terms of probabilities. This model is integrated with a Bayesian belief network characterizing the cause-effect relationships among factors influencing the repairman's error probability during maintenance. The model is applied to a real case concerning mature oil wells.
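As a rough illustration of the availability question posed above, the following sketch simulates a two-state up/down process where each repair attempt succeeds with probability 1 - HEP. The distributions, the fixed HEP value, and the horizon are assumptions made purely for illustration; the paper's semi-Markov model with a Bayesian belief network for HEP is far richer.

```python
# Minimal sketch: availability of a deteriorating system with fallible repair.
import numpy as np

rng = np.random.default_rng(7)
HORIZON = 10_000.0  # hours
HEP = 0.1           # human error probability during maintenance (assumed fixed;
                    # the paper derives it from a Bayesian belief network)

def simulate_once():
    t, uptime = 0.0, 0.0
    while t < HORIZON:
        ttf = rng.weibull(1.5) * 2000.0   # deteriorating system: Weibull TTF
        uptime += min(ttf, HORIZON - t)
        t += ttf
        while t < HORIZON:                # repair attempts until one succeeds
            t += rng.lognormal(3.0, 0.5)  # repair duration (hours)
            if rng.random() > HEP:        # repairman succeeds
                break
    return uptime / HORIZON

avail = np.mean([simulate_once() for _ in range(500)])
print(f"estimated mean availability: {avail:.3f}")
```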
Reliability Engineering & System Safety | 2015
Isis Didier Lins; Enrique López Droguett; Enrico Zio; Carlos Magno Couto Jacinto
Data-driven learning methods for predicting the evolution of the degradation processes affecting equipment are becoming increasingly attractive in reliability and prognostics applications. Among these, we consider here Support Vector Regression (SVR), which has provided promising results in various applications. Nevertheless, the predictions provided by SVR are point estimates, whereas, in order to make better-informed decisions, an uncertainty assessment should also be carried out. For this, we apply bootstrap to SVR so as to obtain confidence and prediction intervals, without having to make any assumption about probability distributions and with good performance even when only a small data set is available. The bootstrapped SVR is first verified on Monte Carlo experiments and then applied to a real case study concerning the prediction of degradation of a component from the offshore oil industry. The results obtained indicate that the bootstrapped SVR is a promising tool for providing reliable point and interval estimates, which can inform maintenance-related decisions on degrading components.
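A minimal sketch of the bootstrap idea, using scikit-learn's SVR: refit the model on resampled data and take percentile bands over the resulting predictions. The data and the number of resamples are illustrative assumptions; note that these percentile bands capture model uncertainty (a confidence interval), while the paper's full treatment also yields prediction intervals.

```python
# Minimal sketch: percentile bootstrap around SVR point predictions.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(60, 1))               # hypothetical degradation times
y = 0.5 * X.ravel() + rng.normal(0, 0.3, size=60)  # hypothetical degradation level
X_new = np.linspace(0, 10, 50).reshape(-1, 1)

B = 200
preds = np.empty((B, len(X_new)))
for b in range(B):
    idx = rng.integers(0, len(X), size=len(X))      # resample with replacement
    preds[b] = SVR(C=10.0, epsilon=0.05).fit(X[idx], y[idx]).predict(X_new)

point = preds.mean(axis=0)
lower, upper = np.percentile(preds, [2.5, 97.5], axis=0)  # 95% percentile band
print(point[:3], lower[:3], upper[:3])
```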
Pesquisa Operacional | 2007
Sérgio Rocha; Enrique López Droguett; Carlos Magno Couto Jacinto
The Generalized Renewal Process (GRP) is a class of probabilistic models that handles repair actions according to the reduction they provide in the real age of an equipment/system. GRP is an extension of the Renewal Process and the Non-Homogeneous Poisson Process, and it is used here to evaluate repair actions with regard to their degree of efficacy. Considering that the times between failures follow a Weibull distribution, such an evaluation is accomplished through the estimation of the GRP parameter distributions and uncertainty analysis on the expected number of failures through Monte Carlo simulation. Due to the paucity of failure data, the probabilistic inference procedure is executed through the Bayesian paradigm, which allows for the use of other sources of information, besides the failure data, in the process of estimating the probability distribution of some parameter of interest.
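The following sketch shows how failure sequences can be sampled from a Weibull-based GRP with a Kijima Type I virtual age, and how the expected number of failures can be estimated by Monte Carlo; the parameter values are fixed for illustration, whereas the paper estimates their posterior distributions via Bayesian inference.

```python
# Minimal sketch: Monte Carlo expected number of failures under a Weibull GRP.
import numpy as np

rng = np.random.default_rng(1)
beta, eta, q = 1.8, 1000.0, 0.4  # Weibull shape/scale and rejuvenation parameter
HORIZON = 5000.0                 # hours

def failures_in_horizon():
    v, t, n = 0.0, 0.0, 0  # virtual age, calendar time, failure count
    while True:
        # inverse-CDF draw of the next TTF, conditional on surviving past age v
        u = rng.random()
        x = eta * ((v / eta) ** beta - np.log(u)) ** (1.0 / beta) - v
        t += x
        if t > HORIZON:
            return n
        n += 1
        v += q * x  # Kijima Type I: repair restores a (1 - q) fraction of the sojourn

expected = np.mean([failures_in_horizon() for _ in range(10_000)])
print(f"expected number of failures over {HORIZON:.0f} h: {expected:.2f}")
```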
Winter Simulation Conference | 2006
Dalton Francisco de Andrade; P.A. Barbetta; P.J.d.F. Filho; N.A.d.M. Zunino; Carlos Magno Couto Jacinto
Nearly every well installation process nowadays relies on some sort of risk assessment study, given the high costs involved. Those studies focus mostly on estimating the total time required by the well drilling and completion operations, as a way to predict the final costs. Among the different techniques employed, Monte Carlo simulation currently stands out as the preferred method. One relevant aspect which is frequently left out of simulation models is the dependence relationship among the processes under consideration. That omission can have a serious impact on the results of risk assessment and, consequently, on the conclusions drawn from them. In general, practitioners do not incorporate the dependence information because doing so is not always an easy task. This paper intends to show how Copula functions may be used as a tool to build correlation-aware Monte Carlo simulation models.
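A minimal sketch of the copula idea: a Gaussian copula imposes rank correlation between two operation times before they enter the Monte Carlo summation. The marginal distributions and the 0.6 correlation are illustrative assumptions.

```python
# Minimal sketch: correlated operation times via a Gaussian copula.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
rho = 0.6
cov = [[1.0, rho], [rho, 1.0]]

z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)  # correlated normals
u = stats.norm.cdf(z)                                      # map to uniforms (the copula)

drilling = stats.lognorm(s=0.4, scale=8.0).ppf(u[:, 0])    # marginal 1 (days)
completion = stats.gamma(a=3.0, scale=2.0).ppf(u[:, 1])    # marginal 2 (days)

total = drilling + completion
print(f"P90 total time: {np.percentile(total, 90):.1f} d "
      f"(rank corr = {stats.spearmanr(drilling, completion)[0]:.2f})")
```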
Journal of Computer Science | 2014
Mariana Dehon Costa e Lima; Silvia Modesto Nassar; Pedro Ivo R. B. G. Rodrigues; Paulo José de Freitas Filho; Carlos Magno Couto Jacinto
Bayesian Network (BN) is a classification technique widely used in Artificial Intelligence. Its structure is a Directed Acyclic Graph (DAG) used to model the association of categorical variables. However, in cases where the variables are numerical, a prior discretization is necessary. Discretization methods are usually based on a statistical approach using the data distribution, such as division by quartiles. In this article we present a discretization using a heuristic that identifies events called peaks and valleys. A Genetic Algorithm was used to identify these events, taking as the objective function the minimization of the error between the average estimated by the BN and the actual value of the numeric output variable. The BN was modeled from a database of bit Rate of Penetration in the Brazilian pre-salt layer, with 5 numerical variables and one categorical variable, using both the proposed discretization and division of the data by quartiles. The results show that the proposed heuristic discretization has higher accuracy than the quartile discretization.
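To make the peak/valley heuristic concrete, the sketch below takes local extrema of a numeric series as candidate bin edges; for brevity, it uses all extrema directly, whereas the paper selects among these events with a Genetic Algorithm driven by the BN estimation error.

```python
# Minimal sketch: peaks and valleys of a series become discretization edges.
import numpy as np

rng = np.random.default_rng(5)
rop = np.cumsum(rng.normal(0, 1, 200)) + 20  # hypothetical ROP-like series

interior = rop[1:-1]
is_peak = (interior > rop[:-2]) & (interior > rop[2:])
is_valley = (interior < rop[:-2]) & (interior < rop[2:])
edges = np.sort(np.unique(interior[is_peak | is_valley]))

bins = np.digitize(rop, edges)               # categorical labels for the BN
print(f"{len(edges)} candidate edges -> {bins.max() + 1} categories")
```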
IEEE International Conference on Fuzzy Systems | 2014
Diego G. Rodrigues; Gabriel Moura; Carlos Magno Couto Jacinto; Paulo José de Freitas Filho; Mauro Roisenberg
This paper presents a novel neuro-fuzzy inference system, called RBFuzzy, capable of knowledge extraction and generation of highly interpretable Mamdani-type fuzzy rules. RBFuzzy is a four-layer neuro-fuzzy inference system that takes advantage of the functional behavior of Radial Basis Function (RBF) neurons and their relationship with fuzzy inference systems. Inputs are combined in the RBF neurons to compose the antecedents of fuzzy rules. The fuzzy rule consequents are determined by the third-layer neurons, where each neuron represents a Mamdani-type fuzzy output variable in the form of a linguistic term. The last layer weights each fuzzy rule and generates the crisp output. An extension of the ant colony optimization (ACO) algorithm is used to adjust the weights of each rule in order to generate an accurate and interpretable fuzzy rule set. For benchmarking purposes, experiments with classic datasets were carried out to compare our proposal with the EFuNN neuro-fuzzy model. RBFuzzy was also applied to a real-world oil well-log database to model and forecast the Rate of Penetration (ROP) of a drill bit for a given offshore well drilling section. The obtained results show that our model can reach the same level of accuracy with fewer rules when compared to EFuNN, which facilitates understanding of the system's operation by a human expert.
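A very rough sketch of the forward path described above: RBF activations act as rule firing strengths, each rule maps to a linguistic consequent, and a weighted sum yields the crisp output. All sizes, centers, and weights below are random placeholders, and the ACO-based weight tuning is omitted entirely.

```python
# Minimal sketch: RBF neurons as fuzzy rule antecedents with weighted defuzzification.
import numpy as np

rng = np.random.default_rng(9)
n_in, n_rules, n_terms = 4, 6, 3
centers = rng.uniform(0, 1, (n_rules, n_in))      # antecedent prototypes
sigma = 0.3
rule_to_term = rng.integers(0, n_terms, n_rules)  # each rule fires one linguistic term
term_values = np.array([0.2, 0.5, 0.8])           # crisp value of "low/medium/high"
rule_weights = rng.uniform(0.5, 1.0, n_rules)     # the weights ACO would tune

def rbfuzzy_forward(x):
    # rule firing strengths from RBF activations
    act = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * sigma ** 2))
    w = rule_weights * act
    # weighted average of consequent term values gives the crisp output
    return np.sum(w * term_values[rule_to_term]) / np.sum(w)

print(rbfuzzy_forward(rng.uniform(0, 1, n_in)))
```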
Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability | 2015
Enrique López Droguett; Isis Didier Lins; Márcio das Chagas Moura; Enrico Zio; Carlos Magno Couto Jacinto
The formation of inorganic scale, particularly calcium carbonate (CaCO3), is a persistent problem and one of the most serious and costly in the oil and gas industry. Scale may cause partial to complete plugging of valves, tubing, and flowlines, thereby reducing production rates. This article proposes the use of support vector regression to build a nonlinear mapping between a set of variables (surface cladding, material, temperature, pressure, brine composition, and fluid velocity) and the scale build-up. The support vector regression is fed with data gathered from laboratory tests carried out on coupons that simulate realistic downhole conditions encountered in oil well bores in the pre-salt fields of Brazil. The proposed failure prediction framework is comprehensive in that it entails the stages of hyperparameter tuning, variable selection, and uncertainty analysis, which are addressed by a combination of particle swarm optimization and bootstrap with support vector regression. The obtained results suggest that the bootstrapped particle swarm optimization + support vector regression is a valuable tool that may be used to support condition-based maintenance decisions.
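A minimal sketch of the hyperparameter-tuning stage: a small hand-rolled particle swarm searching over (log C, log gamma) for an RBF-kernel SVR under cross-validation. The data, swarm size, and PSO coefficients are illustrative assumptions; the paper's full framework also covers variable selection and bootstrap uncertainty analysis.

```python
# Minimal sketch: PSO tuning of SVR hyperparameters via cross-validated MSE.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(80, 5))  # hypothetical coupon-test inputs
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.8]) + rng.normal(0, 0.1, 80)

def loss(p):  # p = (log10 C, log10 gamma)
    svr = SVR(C=10 ** p[0], gamma=10 ** p[1])
    return -cross_val_score(svr, X, y, cv=5, scoring="neg_mean_squared_error").mean()

n, dim, w, c1, c2 = 15, 2, 0.7, 1.5, 1.5
pos = rng.uniform(-2, 2, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(30):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -2, 2)
    f = np.array([loss(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print(f"best C={10 ** gbest[0]:.3g}, gamma={10 ** gbest[1]:.3g}")
```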
Reliability and Maintainability Symposium | 2008
M. das Chagas Moura; Paulo Renato A. Firmino; Enrique López Droguett; Carlos Magno Couto Jacinto
System availability optimization is one of the main issues for oil production managers: the greater the system availability, the greater the production profits. Given that preventive maintenance actions have a rejuvenating effect on the availability indicator, this paper proposes an approach to maximize mean availability by identifying an optimal maintenance policy for downhole optical monitoring systems, which are modeled as non-homogeneous semi-Markov processes. In order to solve the resulting optimization problem, constrained by system performance costs, new real-coded GA operators are also presented. The proposed approach is exemplified by means of an application to a real scenario in onshore oil wells in Brazil.
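To illustrate the optimization loop, the sketch below runs a real-coded GA (blend crossover, Gaussian mutation, elitism) over the preventive-maintenance interval of a textbook age-based replacement policy with Weibull failures; the paper's non-homogeneous semi-Markov model and cost constraints are far richer, so this only shows the GA mechanics.

```python
# Minimal sketch: real-coded GA maximizing availability over the PM interval T.
import numpy as np
from scipy.integrate import quad

beta, eta = 2.0, 1500.0  # Weibull failure behaviour (illustrative)
t_pm, t_cm = 8.0, 48.0   # preventive / corrective maintenance durations (h)

def availability(T):
    R = lambda t: np.exp(-(t / eta) ** beta)
    up = quad(R, 0.0, T)[0]                   # expected uptime per renewal cycle
    down = t_pm * R(T) + t_cm * (1.0 - R(T))  # expected downtime per cycle
    return up / (up + down)

rng = np.random.default_rng(4)
pop = rng.uniform(10.0, 5000.0, 40)           # candidate PM intervals (hours)
for _ in range(60):
    fit = np.array([availability(T) for T in pop])
    best = pop[fit.argmax()]
    parents = pop[rng.choice(len(pop), size=(len(pop), 2), p=fit / fit.sum())]
    alpha = rng.random(len(pop))
    children = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]  # blend crossover
    children += rng.normal(0.0, 50.0, len(pop))                     # Gaussian mutation
    children = np.clip(children, 10.0, 5000.0)
    children[0] = best                                              # elitism
    pop = children

T_opt = pop[np.argmax([availability(T) for T in pop])]
print(f"optimal PM interval: {T_opt:.0f} h, availability: {availability(T_opt):.4f}")
```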