Madeline Diep
University of Nebraska–Lincoln
Publications
Featured research published by Madeline Diep.
Archive | 2013
Forrest Shull; Davide Falessi; Carolyn B. Seaman; Madeline Diep; Lucas Layman
In this chapter, we discuss recent progress and opportunities in empirical software engineering by focusing on a particular technology, Technical Debt (TD), which ties together many recent developments in the field. Recent advances in TD research are providing empiricists the chance to make more sophisticated recommendations that have observable impact on practice.
Empirical Software Engineering and Measurement | 2013
Lucas Layman; Madeline Diep; Meiyappan Nagappan; Janice Singer; Robert DeLine; Gina Venolia
We know surprisingly little about how professional developers define debugging and the challenges they face in industrial environments. To begin exploring professional debugging challenges and needs, we conducted and analyzed interviews with 15 professional software engineers at Microsoft. The goals of this study are: 1) to understand how professional developers currently use information and tools to debug, 2) to identify new challenges in debugging in contemporary software development domains (web services, multithreaded/multicore programming), and 3) to identify the improvements in debugging support that these professionals desire from research. The interviews were coded to identify the most common information resources, techniques, challenges, and needs for debugging as articulated by the developers. The study reveals several debugging challenges faced by professionals, including: 1) the interaction of hypothesis instrumentation and the software environment as a source of debugging difficulty, 2) the impact of log file information on accurate debugging of web services, and 3) the mismatch between the sequential human thought process and the non-sequential execution of multithreaded environments as a source of difficulty. The interviewees also describe desired improvements to tools to support debugging, many of which have been discussed in research but not transitioned to practice.
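To illustrate the third challenge, the mismatch between sequential reasoning and non-sequential execution, the following sketch (not taken from the paper; all names and values are invented for illustration) shows a data race whose failure depends on thread interleaving and therefore resists step-through debugging.

```python
import threading

# Hypothetical illustration of why multithreaded failures resist sequential reasoning:
# two threads perform an unsynchronized read-modify-write on a shared counter,
# so updates are lost only under certain interleavings and the bug is hard to
# reproduce under a debugger.
counter = 0

def worker(iterations):
    global counter
    for _ in range(iterations):
        current = counter        # read
        counter = current + 1    # write; another thread may have updated counter in between

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The expected total is 200000, but the printed value typically varies from run to run.
print(counter)
```

Stepping through either thread in isolation shows correct behavior; the defect only manifests in the interleaving, which is the kind of difficulty the interviewees describe.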
Empirical Software Engineering and Measurement | 2010
Victor R. Basili; Marvin V. Zelkowitz; Lucas Layman; Kathleen Dangle; Madeline Diep
We report on a preliminary case study to examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Our goal is to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. Our purpose was two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to identify potential risks due to incorrect application of the safety process, deficiencies in the safety process, or the lack of a defined process. One early outcome of this work was to show that there are structural deficiencies in collecting valid safety data that make software safety different from hardware safety. In our conclusions we present some of these deficiencies.
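The paper does not publish its analysis scripts; as a purely hypothetical sketch of what quantifying the relative importance of software across a set of hazard reports could look like, the following computes the share of hazard causes that are software-related (the report structure, tags, and data are invented).

```python
# Hypothetical sketch: given hazard reports whose causes are tagged as
# software- or hardware-related, compute the share of causes involving software.
# The report format and tags are invented for illustration; the actual hazard
# report structure used in the study is not reproduced here.
hazard_reports = [
    {"id": "HR-001", "causes": ["software", "hardware"]},
    {"id": "HR-002", "causes": ["hardware"]},
    {"id": "HR-003", "causes": ["software", "software", "hardware"]},
]

all_causes = [c for report in hazard_reports for c in report["causes"]]
software_share = sum(c == "software" for c in all_causes) / len(all_causes)
reports_with_software = sum(
    any(c == "software" for c in r["causes"]) for r in hazard_reports
)

print(f"{software_share:.0%} of causes involve software")
print(f"{reports_with_software} of {len(hazard_reports)} hazard reports cite a software cause")
```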
Empirical Software Engineering and Measurement | 2012
Forrest Shull; Carolyn B. Seaman; Madeline Diep
A significant body of knowledge concerning software inspection practice indicates that the value of inspections varies widely both within and across organizations. Inspection effectiveness and efficiency may be affected by a variety of factors such as inspection planning, the type of software, the developing organization, and many others. In the early 1990s a governmental organization developing complex and highly critical software systems formulated heuristics for inspection planning based on best practices and their early inspection data. Since the development context at the organization has changed in some ways since the heuristics were proposed, it is important to assess whether the heuristics are still a suitable guideline to use. To investigate this question, we statistically evaluated the differences in effectiveness and efficiency between inspections that adhered to the heuristics and ones that did not. Our analysis revealed no significant difference in effectiveness or efficiency for most heuristics. We also learned that compliance with the heuristics is diminishing over time.
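The abstract does not name the statistical test used; as an illustration only, the sketch below assumes a non-parametric Mann-Whitney U comparison of inspection effectiveness between heuristic-compliant and non-compliant inspections, run on randomly generated placeholder data rather than the study's inspection records.

```python
# Illustrative sketch only -- the paper's analysis code and data are not reproduced here.
# We assume effectiveness = defects found per inspection, and compare inspections
# that complied with a planning heuristic against those that did not, using a
# non-parametric test (the test actually used in the study is not stated in the abstract).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Placeholder samples standing in for per-inspection effectiveness measurements.
compliant = rng.poisson(lam=6, size=40)       # inspections that followed the heuristic
non_compliant = rng.poisson(lam=6, size=35)   # inspections that did not

stat, p_value = mannwhitneyu(compliant, non_compliant, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
# A large p-value would be consistent with the study's finding of no
# significant difference for most heuristics.
```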
Hawaii International Conference on System Sciences | 2016
Daniel Port; Joel M. Wilf; Madeline Diep; Carolyn B. Seaman; Martin S. Feather
NASA imposes a multitude of quality process requirements on the development of its software systems. One source of these requirements is the Software Quality Assurance standard. All NASA-sponsored projects are expected to implement these requirements. However, given the diversity of projects and practices at different NASA centers, it is impossible to dictate a priori how these requirements are to be economically satisfied on a given project. Under the auspices of NASA's Software Assurance Research Program, the authors have been developing a value-based methodology to guide practitioners in defensibly and economically planning and executing assurance effort to satisfy this standard. The methodology exploits the intimate relationship between assurance value and risk-informed decision making. This paper describes this relationship, the value-based methodology for scaling assurance efforts, support for using the methodology, and our practice-based validation of the approach.
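The methodology itself is described qualitatively in the paper; the sketch below is only a hypothetical illustration of the underlying idea of risk-informed, value-based planning, ranking candidate assurance activities by expected risk reduction per hour of effort (the activities, numbers, and scoring rule are invented).

```python
from dataclasses import dataclass

# Hypothetical illustration of value-based assurance planning: rank candidate
# assurance activities by expected risk reduction per hour of effort, then
# select activities greedily within an effort budget. None of these names or
# figures come from the paper.
@dataclass
class Activity:
    name: str
    effort_hours: float
    risk_exposure: float        # probability-weighted cost of the targeted risk
    expected_mitigation: float  # fraction of that exposure the activity removes

    @property
    def value_per_hour(self) -> float:
        return (self.risk_exposure * self.expected_mitigation) / self.effort_hours

candidates = [
    Activity("Requirements audit",  40, risk_exposure=200_000, expected_mitigation=0.30),
    Activity("Code peer review",    60, risk_exposure=150_000, expected_mitigation=0.50),
    Activity("Fault-tree analysis", 80, risk_exposure=400_000, expected_mitigation=0.20),
]

budget = 100  # available assurance effort in hours
plan, spent = [], 0.0
for act in sorted(candidates, key=lambda a: a.value_per_hour, reverse=True):
    if spent + act.effort_hours <= budget:
        plan.append(act.name)
        spent += act.effort_hours

print(plan, f"({spent:.0f} hours planned)")
```

The greedy selection is one simple way to make assurance effort defensible against a budget; the paper's actual methodology ties such choices to its risk-informed decision framework rather than to this particular scoring rule.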
The Art and Science of Analyzing Software Data | 2015
Madeline Diep; Linda Esker; Davide Falessi; Lucas Layman; Michele Shaw; Forrest Shull
Software data analytics is key for helping stakeholders make decisions, and thus establishing a measurement and data analysis program is a recognized best practice within the software industry. However, practical implementation of measurement programs and analytics in industry is challenging. In this chapter, we discuss real-world challenges that arise during the implementation of a software measurement and analytics program. We also report lessons learned for overcoming these challenges and best practices for practical, effective data analysis in industry. The lessons learned provide guidance for researchers who wish to collaborate with industry partners in data analytics, as well as for industry practitioners interested in setting up and realizing the benefits of an effective measurement program.
AIAA Infotech @ Aerospace | 2015
Linda J. Esker; Madeline Diep; Frank Herman
Increasingly, large and complex government acquisitions, including critical aerospace systems, make use of Commercial Off-The-Shelf (COTS) hardware and software products. Managing and tracking such projects is especially challenging because there is little guidance available on what to measure to adequately understand the progress and quality of COTS-intensive software components while the system is being developed. Drawing on COTS estimation and COTS acquisition measurement research, together with experience on a large ground space communications system, this paper reports on our experience implementing a measurement program to manage software development activities of a COTS-intensive system in the aerospace domain, focusing mainly on managing the COTS development.
IEEE Software | 2010
Forrest Shull; Grigori Melnik; Burak Turhan; Lucas Layman; Madeline Diep; Hakan Erdogmus
Archive | 2010
Burak Turhan; Lucas Layman; Madeline Diep; Hakan Erdogmus; Forrest Shull
Annual International Symposium of the International Council on Systems Engineering, INCOSE 2012 and the 8th Biennial European Systems Engineering Conference, EuSEC 2012 | 2012
Manuel Mastrofini; Giovanni Cantone; Carolyn B. Seaman; Forrest Shull; Madeline Diep; Davide Falessi