Luiz Alberto Vieira Dias
Instituto Tecnológico de Aeronáutica
Publication
Featured research published by Luiz Alberto Vieira Dias.
international conference on information technology: new generations | 2010
Gabriel de Souza Pereira Moreira; Roberto Pepato Mellado; Denis Ávila Montini; Luiz Alberto Vieira Dias; Adilson Marques da Cunha
This paper describes a framework for a software internal quality measurement program with automatic metrics extraction. The framework was successfully implemented in an industrial Software Factory, made possible by a proposed Continuous Integration (CI) environment that periodically analyzes source code and extracts metrics. These metrics were consolidated in a Data Warehouse, allowing On-line Analytical Processing (OLAP) and Key Performance Indicator (KPI) analysis through a high-performance, user-friendly interface. The measurement program followed the GQ(I)M paradigm for metrics selection, to ensure that the collected metrics are relevant from the perspective of the Software Factory's goals. Finally, the Measurement and Analysis Process Area of the Capability Maturity Model Integration (CMMI) was used for measurement and analysis planning and implementation.
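The flow the abstract describes, metrics collected per build, consolidated in a warehouse, then judged against KPI targets, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the metric names (`cyclomatic_complexity`, `test_coverage`) and the KPI thresholds are hypothetical.

```python
# Minimal sketch of consolidating per-build metrics into a KPI,
# in the spirit of the GQ(I)M-driven program described above.
# Metric names and KPI thresholds are invented for illustration.

def consolidate(builds):
    """Average each metric across builds (a tiny OLAP-style rollup)."""
    totals = {}
    for build in builds:
        for metric, value in build.items():
            totals.setdefault(metric, []).append(value)
    return {m: sum(vs) / len(vs) for m, vs in totals.items()}

def kpi_code_health(avg, max_complexity=10.0, min_coverage=0.8):
    """KPI: True when average complexity and coverage meet their targets."""
    return (avg["cyclomatic_complexity"] <= max_complexity
            and avg["test_coverage"] >= min_coverage)

builds = [
    {"cyclomatic_complexity": 8.0, "test_coverage": 0.85},
    {"cyclomatic_complexity": 9.0, "test_coverage": 0.82},
]
avg = consolidate(builds)
print(avg)                    # averaged metrics per name
print(kpi_code_health(avg))  # True
```

In the paper's setting the `builds` list would come from the CI environment's periodic source-code analysis rather than being hard-coded.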
international conference on information technology: new generations | 2009
Gláucia Braga e Silva; Breno Lisi Romano; Henrique Fernandes de Campos; Ricardo Godoi Vieira; Adilson Marques da Cunha; Luiz Alberto Vieira Dias
This paper describes a logical database integration process implemented for existing databases of the Brazilian National Water Agency (ANA). It covers an important part of a Brazilian project between ANA and the Brazilian Aeronautics Institute of Technology (ITA). The integration process started with a detailed analysis of four existing databases, followed by the elaboration and integration of logical models. Its major contributions are the description of the main processes, documentation, model auditing, and human resources involved. Finally, the best practices used in applying the main data modeling and database design techniques, together with the successful use of modeling tools, are also presented.
international conference on information technology: new generations | 2015
Emanuel Mineda Carneiro; Luiz Alberto Vieira Dias; Adilson Marques da Cunha; Lineu Fernando Stege Mialaret
Data normalization for use in Artificial Neural Networks often requires extensive statistical analysis. This paper presents an initial investigation of a case study involving credit card fraud detection, in which Cluster Analysis was applied to data normalization. Early results from the use of Artificial Neural Networks and Cluster Analysis for fraud detection have shown that the number of network inputs can be reduced by clustering attributes.
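One way to read "normalization by clustering" is that each raw attribute is replaced by the index of its nearest cluster, so the network receives one small integer per attribute instead of a wide-ranging raw value. The sketch below shows that idea with a tiny pure-Python k-means on made-up transaction amounts; the data and the choice of k are assumptions, not taken from the paper.

```python
# Hedged sketch: replacing a raw attribute with its cluster index,
# so a neural network sees one compact input per attribute instead
# of a wide-ranging raw value. The transaction amounts are invented.

def kmeans(points, k, iters=20):
    """Very small k-means for 1-D points; returns final centroids."""
    centroids = points[:k]
    for _ in range(iters):
        groups = {i: [] for i in range(k)}
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centroids[c]))
            groups[i].append(p)
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in groups.items()]
    return centroids

def normalize(value, centroids):
    """Map a raw attribute value to its nearest cluster index."""
    return min(range(len(centroids)),
               key=lambda c: abs(value - centroids[c]))

amounts = [5.0, 7.0, 6.0, 480.0, 510.0, 9500.0]  # transaction amounts
cents = sorted(kmeans(amounts, k=3))
print([normalize(a, cents) for a in amounts])  # [0, 0, 0, 1, 1, 2]
```

The six raw amounts collapse into three "small / medium / large" buckets, which is the kind of input reduction the abstract reports.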
ieee/aiaa digital avionics systems conference | 2006
Denis Silva Loubach; João Carlos Silva Nobre; Adilson Marques da Cunha; Luiz Alberto Vieira Dias; Marcos Ribeiro do Nascimento; Walter Abrahão dos Santos
This paper reports an academic experience at the Brazilian Aeronautical Institute of Technology using automated software testing tools applied to critical real-time embedded systems. The work uses a problem-based learning (PBL) teaching methodology, a Rational Unified Process (RUP) tailoring, and the IBM-Rational Quality Architect RealTime and IBM-Rational Test RealTime tools. This academic experience originated from the needs and specifications of the Brazilian Aeronautics and Space Institute (IAE) and the Brazilian National Institute for Space Research (INPE). The case study is based upon three ongoing realistic software projects, all of them using real-time embedded software, respectively named unmanned aerial vehicle (UAV), student satellite (SSAT), and ground control station (GCS). This experience involved 15 senior computer engineering undergraduates and 18 graduate students developing, testing, verifying, and validating the system, in just 17 academic weeks, across four aggregation levels: computer software units (CSU), computer software components (CSC), and computer software configuration items (CSCI), in just one computer software system (CSS), demanding state-of-the-art software engineering processes and real-time notations and tools. The major contribution of this paper is the proper utilization of available tools to perform automated testing, improving deliverable software quality, reliability, and safety, increasing the expertise of the professionals involved, and reducing the time needed to perform unit, integration, and system testing.
ieee/aiaa digital avionics systems conference | 2011
Guilherme Correa; Adilson Marques da Cunha; Luiz Alberto Vieira Dias; Osamu Saotome
international conference on information technology: new generations | 2010
Etiene Lamas; Érica Ferreira; Marcos Ribeiro do Nascimento; Luiz Alberto Vieira Dias
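Automated unit testing at the CSU level, of the kind the tools above support, amounts to exercising a unit against its specification and letting the framework report pass/fail. A minimal Python sketch follows; the function under test (a saturating altitude filter) is hypothetical, chosen only to resemble an embedded-software unit.

```python
# Minimal sketch of automated unit testing at the CSU level.
# The function under test (a saturating altitude filter) is invented.
import unittest

def clamp_altitude(meters, floor=0.0, ceiling=20000.0):
    """Saturate a sensor reading to a valid flight-envelope range."""
    return max(floor, min(meters, ceiling))

class ClampAltitudeTest(unittest.TestCase):
    def test_within_range(self):
        self.assertEqual(clamp_altitude(1500.0), 1500.0)

    def test_saturates_low_and_high(self):
        self.assertEqual(clamp_altitude(-5.0), 0.0)
        self.assertEqual(clamp_altitude(99999.0), 20000.0)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(ClampAltitudeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Running such suites automatically on every build is what shrinks the unit, integration, and system testing time the abstract highlights.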
international conference on information technology: new generations | 2009
Claudio Goncalves Bernardo; Denis Ávila Montini; Danilo Douradinho Fernandes; Daniela America da Silva; Luiz Alberto Vieira Dias; Adilson Marques da Cunha
The use of automatic code generation tools has been increasing in recent years, mainly because it helps engineers develop documented software faster and better than hand coding. Nowadays, there are many tools available from different vendors; however, the tools most used for critical environments and real-time applications are Rational Rose RealTime (RRRT) and Rational Rhapsody. Recently, National Instruments, the LabVIEW software vendor, released a toolkit to generate C code from its visual programming language. This represented a breakthrough for LabVIEW developers, allowing verification of what is being developed. MathWorks, the Matlab software vendor, provides a toolkit similar to the LabVIEW one. A case study comparing source code generated by CASE tools with hand-written code was presented at the ITNG 2008 Conference. In that case study, the authors concluded that hand-written source code is less complex than CASE tool source code; however, the cost/benefit ratio becomes better when using CASE tools. The main purpose of this paper is to provide insight into automatically generated code using Model-Based Development (MBD) tools (LabVIEW, Matlab, and Rational Rose RealTime). The case study presents the modeling of a 100-point sine wave application. A comparison between code metrics is performed, in order to verify which tool best fits a given project.
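The comparison the paper performs rests on code metrics such as size and complexity. The sketch below computes two simple ones, lines of code and a rough decision-point count (a crude stand-in for cyclomatic complexity), for two invented C-like snippets of the sine-wave example; neither the snippets nor the metric definitions are taken from the paper.

```python
# Hedged sketch of the kind of metric comparison the paper performs:
# lines of code plus a rough decision-point count, applied to a
# hand-written and a (deliberately clumsier) generated-style snippet.
# Both snippets are invented for illustration.

DECISION_KEYWORDS = ("if ", "for ", "while ", "case ")

def metrics(source):
    lines = [l.strip() for l in source.splitlines() if l.strip()]
    decisions = sum(l.count(k) for l in lines for k in DECISION_KEYWORDS)
    return {"loc": len(lines), "complexity": decisions + 1}

hand_coded = """
for (i = 0; i < 100; i++)
    out[i] = sin(2 * PI * i / 100);
"""
generated = """
i = 0;
while (i < 100) {
    if (i < 100) { out[i] = sin(2 * PI * i / 100); }
    i = i + 1;
}
"""
print(metrics(hand_coded))  # {'loc': 2, 'complexity': 2}
print(metrics(generated))   # {'loc': 5, 'complexity': 3}
```

The lower numbers for the hand-written version mirror the ITNG 2008 finding the abstract cites: hand-written code tends to be smaller and less complex, even when generated code wins on cost/benefit.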
Archive | 2018
Romulo Alceu Rodrigues; Lineu Alves Lima Filho; Gildarcio Sousa Goncalves; Lineu Fernando Stege Mialaret; Adilson Marques da Cunha; Luiz Alberto Vieira Dias
This paper presents a framework entitled Organizational Testing Management Maturity Model (OTM3). The proposed framework is a set of structures to support the development and testing of Software Product Lines, following Experimental Software Engineering concepts. OTM3 is a framework for iterative, incremental, and continuous models. It provides information to the organization and a method to identify, establish, and keep the capabilities demanded by test maturity models. This is achieved through patterns, measures, controls, and software engineering best practices.
international conference on information technology: new generations | 2014
Andre Sarkis; Luiz Alberto Vieira Dias
This article describes a methodology named Causal Analysis and Resolution (CAR), based on Goals, Questions, and Metrics (GQM) principles, in which indicators are defined from metrics for a decision-making process. Its main contributions are the construction of an information process system model and a prototype that apply the GQM approach to the quantitative definition of qualitative metrics. CAR is a Process Area (PA) of the Capability Maturity Model Integration (CMMI) for software development from Carnegie Mellon University. This PA was used to eliminate the systematic error cases listed in a Technical Report (TR) generated by CAR. An information system model was created to allow the elimination of defects, errors, and failures in a design pattern named IO Manager during the test phase, before its publication in a components library. The prototype was created using Rational Rose RealTime (RRRT) with a focus on verification tests, and provided a quality assessment of the IO Manager design pattern. Built on GQM and CAR together with the information process system model, the prototype aimed to monitor errors in design pattern tests for real-time embedded systems on a software production line.
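The GQM chain the article builds on runs goal, then question, then metric, then indicator: a metric answers a question, and an indicator compares the metric against a target to decide whether causal analysis is triggered. A minimal sketch, with a hypothetical defect-density metric and an invented threshold:

```python
# Minimal GQM-style sketch: a metric answers a question, and an
# indicator compares it against a target to trigger causal analysis.
# The metric (defect density) and the threshold are hypothetical.

def defect_density(defects_found, kloc):
    """Metric: defects per thousand lines of code."""
    return defects_found / kloc

def indicator(metric_value, target):
    """Indicator: flag when the metric exceeds its target."""
    return "investigate-causes" if metric_value > target else "ok"

density = defect_density(defects_found=12, kloc=4.0)  # 3.0 defects/KLOC
print(indicator(density, target=2.5))  # investigate-causes
```

In CAR terms, the "investigate-causes" outcome is what would open the causal-analysis step for the systematic errors listed in the Technical Report.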
international conference on information technology: new generations | 2013
Marcelo Paiva Ramos; Gustavo Ravanhani Matuck; Ciro Fernandes Matrigrani; Samoel Mirachi; Eliezer Segeti; Marco Leite; Adilson Marques da Cunha; Luiz Alberto Vieira Dias
The project entitled Big Data, Internet of Things, and Mobile Devices (in Portuguese, Banco de Dados, Internet das Coisas e Dispositivos Móveis, BDIC-DM) was implemented at the Brazilian Aeronautics Institute of Technology (ITA) in the 1st semester of 2015. It involved 60 graduate students within just 17 academic weeks. As a starting point for some features of a real-time Online Transactional Processing (OLTP) system, the Relational Database Management System (RDBMS) MySQL was used along with the NoSQL database Cassandra to store transaction data generated from the web portal and mobile applications. For batch data analysis, the Apache Hadoop ecosystem was used for Online Analytical Processing (OLAP). An infrastructure based on the Apache Sqoop tool allowed exporting data from the relational database MySQL to the Hadoop Distributed File System (HDFS), while Python scripts were used to export transaction data from the NoSQL database to HDFS. The main objective of the BDIC-DM project was to implement an e-commerce prototype system to manage credit card transactions involving large volumes of data, using different technologies. The tools used involved the generation, storage, and consumption of Big Data. This paper describes the process of integrating NoSQL and relational databases with a Hadoop cluster during an academic project using the Scrum agile method. In the end, processing time decreased significantly through the use of appropriate tools and available data. For future work, the investigation of other tools and datasets is suggested.
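The Python export step the abstract mentions is essentially a flattening job: NoSQL-style JSON transaction records become delimited text that HDFS tooling can ingest (for example with `hdfs dfs -put`). The sketch below shows that transformation with stdlib modules only; the record fields are invented, not the project's actual schema.

```python
# Hedged sketch of the NoSQL-to-HDFS export step: flattening
# JSON-style transaction records into CSV text suitable for
# loading into HDFS. Field names are invented for illustration.
import csv
import io
import json

records_json = """[
  {"tx_id": 1, "card": "1111", "amount": 25.9},
  {"tx_id": 2, "card": "2222", "amount": 310.0}
]"""

def to_csv(records):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["tx_id", "card", "amount"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

csv_text = to_csv(json.loads(records_json))
print(csv_text)
```

In the project's pipeline the records would be read from Cassandra and the resulting file pushed into HDFS, with Sqoop handling the parallel MySQL side.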