Publication


Featured research published by Nicola Boffoli.


Product Focused Software Process Improvement | 2004

Managing Software Process Improvement (SPI) through Statistical Process Control (SPC)

Teresa Baldassarre; Nicola Boffoli; Danilo Caivano; Giuseppe Visaggio

Measurement-based software process improvement is nowadays a mandatory activity. It implies continuous process monitoring in order to predict process behavior, highlight performance variations and, if necessary, react to them quickly. Process variations are due to common causes or assignable ones: the former are part of the process itself, while the latter arise from exceptional events that make process behavior unstable and thus less predictable. Statistical Process Control (SPC) is a statistically based approach that determines whether a process is stable or not by discriminating between the presence of common cause variation and assignable cause variation. It is a well-established technique that has proven effective in manufacturing processes, but not yet in software process contexts, where experience in using SPC is not yet mature and a clear understanding of SPC outcomes is still lacking. Although many authors have applied it to software, they have not considered the primary differences between manufacturing and software process characteristics. Because of such differences, the authors maintain that SPC cannot be adopted as is but must be tailored. In this sense, we propose an SPC-based approach that reinterprets SPC and applies it from a software process point of view. The paper validates the approach on industrial project data and shows how it can be successfully used as a decision support tool in software process improvement.
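The control-chart mechanics the abstract builds on can be sketched as follows. This is a minimal illustration of an individuals (X) chart with 3-sigma limits estimated from the average moving range; the data and function names are invented for illustration and are not taken from the paper.

```python
# Individuals control chart: estimate 3-sigma limits from the moving range
# and flag points outside them as candidate assignable-cause variations.

def control_limits(samples):
    """Center line and 3-sigma limits for an individuals chart.

    Uses the average moving range (mR-bar) divided by the d2 = 1.128
    constant for subgroups of size 2, the standard sigma estimator.
    """
    n = len(samples)
    center = sum(samples) / n
    moving_ranges = [abs(samples[i] - samples[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

def assignable_causes(samples):
    """Indices of points falling outside the control limits."""
    lcl, _, ucl = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical process-performance measures (e.g. effort per fix, in hours):
data = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 9.7, 5.1, 4.7, 5.0]
print(assignable_causes(data))  # [6] -> the spike at index 6 is flagged
```

A stable ("under control") process would produce an empty list; the outlier signals an exceptional event worth investigating.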


Proceedings of the Third International Workshop on Product LinE Approaches in Software Engineering | 2012

Driving flexibility and consistency of business processes by means of product-line engineering and decision tables

Nicola Boffoli; Danilo Caivano; Daniela Castelluccia; Giuseppe Visaggio

Today's organizations are increasingly distributed across space, time and capabilities, and are driven to leverage synergies by integrating their business processes in order to produce new value-added products and services; hence the importance of integrating whole processes rather than simply integrating databases or software applications. Given the duality between products and processes, we propose to exploit the flexibility of the product-line engineering approach by modeling business processes as a Business Process Line (BPL), in order to capture process variability, promote reuse and integration, and provide the capacity to anticipate process changes. To support process evolution and consistency, we suggest the use of decision tables to elicit, track and manage the decision points that emerge during business process modeling, with the purpose of maintaining the relationships among business needs, environmental changes and process tasks. In a real case study we applied the proposed methodology by leveraging the synergy of feature models, variability mechanisms and decision tables. The results show that the BPL satisfies the requirements for business process flexibility.


SC'12 Proceedings of the 11th international conference on Software Composition | 2012

Business process lines and decision tables driving flexibility by selection

Nicola Boffoli; Danilo Caivano; Daniela Castelluccia; Giuseppe Visaggio

A major challenge faced by organizations is to capture business strategies in products and services at an ever-increasing pace as the business environment constantly evolves. We propose a novel methodology based on a Business Process Line (BPL) engineering approach to inject flexibility into the process-modeling phase and promote reuse and flexibility by selection. Moreover, we suggest a decision-table (DT) formalism for eliciting, tracking and managing the relationships among business needs, environmental changes and process tasks. In a real case study we applied the proposed methodology by leveraging the synergy of feature models, variability mechanisms and decision tables. The application of the DT-based BPL engineering approach shows that the Business Process Line benefits from fundamental concepts such as composition, reusability and adaptability, and satisfies the requirements for process-definition flexibility.
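The idea of "flexibility by selection" can be sketched as follows: a Business Process Line holds all candidate process tasks, each guarded by the features that require it, and a concrete process variant is derived by choosing a feature set. Task and feature names here are invented for illustration and do not come from the case study.

```python
# A Business Process Line as a list of (task, required-features) pairs;
# a variant is derived by selecting the tasks whose guards are satisfied.

BPL = [
    ("receive_order",  set()),              # common task, in every variant
    ("check_credit",   {"b2b"}),            # only for business customers
    ("online_payment", {"web_channel"}),
    ("manual_invoice", {"paper_channel"}),
    ("ship_goods",     set()),
]

def derive_variant(selected_features):
    """Keep each task whose feature guard is a subset of the selection."""
    return [task for task, guard in BPL if guard <= selected_features]

print(derive_variant({"b2b", "web_channel"}))
# ['receive_order', 'check_credit', 'online_payment', 'ship_goods']
```

Changing the business context then means changing the selected feature set, not rewriting the process model.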


International Conference on Software Maintenance | 2006

SPEED: Software Project Effort Evaluator based on Dynamic-calibration

Maria Teresa Baldassarre; Nicola Boffoli; Danilo Caivano; Giuseppe Visaggio

Effort estimation is a long-standing problem, but, in spite of the amount of research invested in this field, it remains an open issue in the software engineering community. In two previous works the authors proposed an approach named dynamic calibration for effort estimation of software projects. In this paper they present a tool named SPEED that implements the dynamic calibration approach.
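The gist of dynamic calibration, sketched very loosely: keep re-estimating the model's parameters from the most recently completed projects, so the estimator tracks the current process rather than a fixed historical baseline. The SPEED model itself is not reproduced here; the sliding-window productivity factor below is an illustrative assumption.

```python
# Minimal dynamic-calibration sketch: a productivity factor recalibrated
# over a sliding window of completed (size, actual effort) observations.

from collections import deque

class DynamicCalibrator:
    def __init__(self, window=5):
        self.history = deque(maxlen=window)  # most recent projects only

    def record(self, size, actual_effort):
        """Feed back a completed project, recalibrating the model."""
        self.history.append((size, actual_effort))

    def estimate(self, size):
        """Effort = size * (current effort-per-size-unit ratio)."""
        total_size = sum(s for s, _ in self.history)
        total_effort = sum(e for _, e in self.history)
        return size * total_effort / total_size

cal = DynamicCalibrator(window=3)
for size, effort in [(100, 220), (80, 180), (120, 250)]:
    cal.record(size, effort)
print(round(cal.estimate(90), 1))  # 195.0
```

As older projects fall out of the window, estimates automatically follow drifts in team productivity.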


ACM SIGSOFT Software Engineering Notes | 2014

Service-oriented product lines: a systematic mapping study

Daniela Castelluccia; Nicola Boffoli

Software product line engineering and service-oriented architectures both enable organizations to capitalize on the reuse of existing software assets and capabilities, improving competitive advantage in terms of development savings, product flexibility and time-to-market. Both approaches accommodate variation of assets, including services, by changing the software being reused or by composing services according to a new orchestration. Therefore, variability management in Service-oriented Product Lines (SoPL) is one of the main challenges today. In order to highlight the emerging evidence-based results from the research community, we apply the well-defined method of systematic mapping to populate a classification scheme for the SoPL field of interest. The analysis of the results throws light on the current open issues. Moreover, different facets of the scheme can be combined to answer more specific research questions. The report reveals the need for more empirical research able to provide new metrics measuring the efficiency and efficacy of the proposed models, and new methods and tools supporting variability management in SoPL, especially during maintenance and verification and validation. This mapping study on SoPL opens further investigations by means of a complete systematic review to select and validate the most efficient solutions to variability management in SoPL.


Journal of e-Learning and Knowledge Society | 2011

The Lifelong Learning in the University: Learning Networks and Knowledge Transferring

Pasquale Ardimento; Nicola Boffoli; Vito Nicola Convertini; Giuseppe Visaggio

Practitioners must continually update their skills to align their professional profile with the needs of the market and of the social organizations in which they live, both characterized by extreme variability and volatility. In this scenario, universities, the traditional institutions for knowledge transfer, assume the role of institutions dedicated to lifelong learning. However, lifelong learning raises several issues that make it a poor fit for traditional university instructional models. To face this problem, the authors propose to use a Learning Network model integrated with a Knowledge Experience Base (Prometheus) to support the distribution of contents and to enhance knowledge transfer. The results of an empirical experimentation encourage their adoption in real contexts.


Archive | 2010

Statistical Process Control for Software: Fill the Gap

Nicola Boffoli; Maria Teresa Baldassarre; Danilo Caivano

The characteristic of software processes, unlike manufacturing ones, is that they have a very high human-centered component and are primarily based on cognitive activities. As such, each time a software process is executed, its inputs and outputs may vary, as may the process performances. This phenomenon is identified in the literature as "Process Diversity" (IEEE, 2000). Given the characteristics of a software process, its intrinsic diversity makes it difficult to predict, monitor and improve, unlike what happens in other contexts. In spite of this, Software Process Improvement (SPI) is a very important activity that cannot be neglected. To face these problems, the software engineering community stresses the use of measurement-based approaches such as QIP/GQM (Basili et al., 1994) and time series analysis: the first is usually used to determine what improvement is needed; time series analysis is adopted to monitor process performances. It thus supports decision making in terms of when the process should be improved, and provides a means to verify the effectiveness of the improvement itself. A technique for time series analysis that is well established in the literature, and that has given insightful results in manufacturing contexts although not yet in software process ones, is Statistical Process Control (SPC) (Shewhart, 1980; Shewhart, 1986). The technique was originally developed by Shewhart in the 1920s and then used in many other contexts. The basic idea it relies on is the use of so-called "control charts", together with their indicators, called run tests, to establish operational limits for acceptable process variation and to monitor and evaluate the evolution of process performances over time.

In general, process performance variations are mainly due to two types of causes, classified as follows:

- Common cause variations: the result of normal interactions of people, machines, environment, techniques used and so on.
- Assignable cause variations: arising from events that are not part of the process and that make it unstable.

In this sense, the statistically based approach, SPC, helps determine whether a process is stable or not by discriminating between common cause variation and assignable cause variation. We can classify a process as "stable" or "under control" if only common causes occur. More precisely, in SPC, data points representing measures of process performances are collected.
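One of the run tests mentioned above can be sketched as follows: a classical "runs" rule flags several consecutive points on the same side of the center line, signaling a process shift even when no single point crosses the 3-sigma limits. The run length of 8 and the sample data are illustrative assumptions, not values taken from the chapter.

```python
# Run test: flag a sustained run of points on one side of the center line.

def run_test(samples, center, run_length=8):
    """True if `run_length` consecutive points fall strictly on one side
    of the center line, a common indicator of an assignable cause."""
    run, last_side = 0, 0
    for x in samples:
        side = 1 if x > center else (-1 if x < center else 0)
        run = run + 1 if (side == last_side and side != 0) else (1 if side else 0)
        last_side = side
        if run >= run_length:
            return True
    return False

# Hypothetical performance data, all within limits but shifted above center:
shifted = [10.2, 10.4, 10.1, 10.3, 10.5, 10.2, 10.6, 10.3]
print(run_test(shifted, center=10.0))  # True
```

Combining out-of-limit checks with run tests is what lets control charts catch both abrupt spikes and gradual drifts.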


Archive | 2013

Tabularizing the Business Knowledge: Modeling, Maintenance and Validation

Nicola Boffoli; Daniela Castelluccia; Giuseppe Visaggio

Achieving business flexibility implies explicitly representing business knowledge and making it easy for decision-makers to understand. There is renewed interest in decision tables as a knowledge-modeling formalism able to represent the relationships among business conditions, actions and decisions with completeness and consistency. We explore the benefits of decision tables applied to the modeling and management of business rules and constraints, finding their major advantages in compact formalization, safe maintenance and automated validation.
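The automated validation the abstract refers to can be sketched like this: represent each decision-table rule as condition entries (True/False/None for "don't care") mapped to an action, then check completeness (every condition combination is covered) and consistency (no combination triggers conflicting actions). The example table and names are invented for illustration.

```python
# Decision-table validation: completeness and consistency over all
# combinations of binary condition values.

from itertools import product

def matches(rule_conds, combo):
    """A rule matches a combination if every entry equals it or is None."""
    return all(c is None or c == v for c, v in zip(rule_conds, combo))

def validate(rules, n_conditions):
    """Return (missing_combos, conflicting_combos)."""
    missing, conflicting = [], []
    for combo in product([True, False], repeat=n_conditions):
        actions = {a for conds, a in rules if matches(conds, combo)}
        if not actions:
            missing.append(combo)       # incompleteness: no rule fires
        elif len(actions) > 1:
            conflicting.append(combo)   # inconsistency: rules disagree
    return missing, conflicting

# Conditions: (customer_is_premium, order_over_limit)
rules = [
    ((True,  None),  "approve"),
    ((False, False), "approve"),
    ((False, True),  "manual_review"),
]
print(validate(rules, 2))  # ([], []) -> complete and consistent
```

Dropping the last rule would make `(False, True)` show up as a missing combination, which is exactly the kind of gap such validation is meant to surface before deployment.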


ACM SIGSOFT Software Engineering Notes | 2017

Environmental Big Data: a systematic mapping study

Daniela Castelluccia; Enrico Giacinto Caldarola; Nicola Boffoli

Big data sets and analytics are increasingly being used by government agencies, non-governmental organizations, and private companies to advance environmental protection. Improving energy efficiency, promoting environmental justice, tracking climate change, and monitoring water quality are just a few of the objectives being furthered by the use of Big Data. The authors provide a detailed analysis of the emerging evidence-based insights on Environmental Big Data (EBD) by applying the well-defined method of systematic mapping. The analysis of the results throws light on the current open issues of Environmental Big Data. Moreover, different facets of the study can be combined to answer more specific research questions. The report reveals the need for more empirical research able to provide new metrics measuring the efficiency and effectiveness of the proposed analytics, and new methods and tools supporting the data-processing workflow in EBD.


Archive | 2016

Enforcing Software Developers’ Productivity by Using Knowledge and Experience

Pasquale Ardimento; Maria Teresa Baldassarre; Nicola Boffoli; Danilo Caivano; Michele Scalera; Giuseppe Visaggio

Objective: Explore the relation between developers, a Knowledge Experience Base (KEB) called PROMETHEUS, and their performance in the development of enterprise applications, in order to propose a theory that expresses these relations based on empirical evidence. Methods: Case study carried out in a real context with 5 development teams of 6 staff members each, who in turn carried out evolutive maintenance tasks on 5 software packages commercialized by the enterprise under 5 different process models. Results: In the 5 experimental teams that used the KEB, productivity almost doubled compared to previous data without the KEB. Conclusions: We can assume that the theory extends to the development process: using a KEB in maintenance processes improves developer productivity, as it mitigates the errors made in the many decisions taken during project execution. Experience collected in PROMETHEUS becomes part of the organizational culture once formalized.

Collaboration


Dive into Nicola Boffoli's collaborations.

Top Co-Authors

Marta Cimitile

Sapienza University of Rome
