
Publication


Featured research published by Joel M. Wilf.


Empirical Software Engineering and Measurement | 2013

The Value of Certifying Software Release Readiness: An Exploratory Study of Certification for a Critical System at JPL

Daniel Port; Joel M. Wilf

A software release is a decision to deliver code to an organization outside of the development team, usually for testing or operational purposes. For critical systems this can be a risky decision, where failure to pass a test or holding up the project schedule can have a major impact. The release decision is primarily based on an understanding of the level of quality the software currently has (be it high quality, low quality, or unknown). But for large, complex systems, determining the level of quality with high confidence is a challenge. A poor understanding of the confidence in the quality level increases decision risk, potentially leading to a bad release decision that could have been avoided had the confidence in the quality been better known. Certification of release readiness attempts to address this risk by building confidence in the quality level. But certification comes at a cost, and the relationship between certification and decision risk reduction has not been well understood. This work describes our experience investigating the value of certification and our efforts to improve the mandated software readiness certification record (SRCR) process. A well-known critical system at JPL is used as a case study to exemplify this effort.


Hawaii International Conference on System Sciences | 2011

A Study on the Perceived Value of Software Quality Assurance at JPL

Daniel Port; Joel M. Wilf

As software quality assurance (SQA) moves from being a compliance-driven activity to one driven by value, it is important that all stakeholders involved in a software development project have a clear understanding of how SQA contributes to their efforts. However, a recent study at JPL indicates that different groups of stakeholders have significantly different perceptions about the value of SQA activities and their expectations of what constitutes SQA success. This lack of a common understanding of value has fragmented SQA efforts. Activities are driven by the desires of whichever group of stakeholders happens to hold the greatest influence at the moment, with the result that the project as a whole does not realize the full or needed value of SQA. We examine this and other results of the recent study and how they impact both the real and the perceived value of SQA.


Procedia Computer Science | 2014

The Value Proposition for Assurance of JPL Systems

Dan Port; Joel M. Wilf

This paper presents a value proposition for systems assurance. The need for a value proposition is motivated by common misconceptions about the definition of assurance and the value of performing systems assurance activities. The focus of the value proposition is that assurance reduces uncertainty so that projects can make more confident decisions about their systems. Applying the value proposition has led to insights into the nature of assurance and has improved the practice of software assurance where it has been applied at the Jet Propulsion Laboratory (JPL). Ongoing work on using the value proposition for “value-based tailoring” of requirements and on integrating value considerations into assurance cost models is also discussed.
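The core of the value proposition, that assurance reduces uncertainty so projects can make more confident decisions, can be illustrated with a small expected-value-of-information calculation. The sketch below is not taken from the paper; the release/fix scenario, costs, and probabilities are invented for illustration.

```python
# Hypothetical value-of-information sketch: assurance does not change the
# software, it changes what the project knows, and that knowledge has value.
# All names and numbers here are invented for illustration.

# Decision: release now, or spend extra cost on a fix/test cycle first.
P_DEFECT = 0.30          # current belief that a serious defect remains
LOSS_DEFECT = 2_000_000  # cost of a serious defect escaping to operations
COST_FIX = 400_000       # cost of the extra fix/test cycle

def expected_loss(release: bool, p_defect: float) -> float:
    """Expected cost of each choice under the current belief."""
    return p_defect * LOSS_DEFECT if release else COST_FIX

# Without assurance: pick the cheaper option under the vague belief.
loss_without = min(expected_loss(True, P_DEFECT), expected_loss(False, P_DEFECT))

# With assurance: an (idealized) perfectly informative assurance finding
# tells the project whether the defect is really there before it decides.
loss_with = (P_DEFECT * min(LOSS_DEFECT, COST_FIX)       # defect present
             + (1 - P_DEFECT) * min(0, COST_FIX))        # defect absent

print(f"Expected loss without assurance: ${loss_without:,.0f}")
print(f"Expected loss with assurance:    ${loss_with:,.0f}")
print(f"Value of the assurance information: ${loss_without - loss_with:,.0f}")
```

In this toy scenario the assurance evidence is worth $280,000 purely because it sharpens the decision, even though it never touches the code itself.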


Hawaii International Conference on System Sciences | 2013

Visualization of Software Assurance Information

Martin S. Feather; Joel M. Wilf

During the conduct of Software Assurance on a software development project, data is gathered on both the software being developed and the development processes being followed. It is from this information that Software Assurance derives insights into the quality of the software itself and the efficacy of the development process. For large software developments, such data can be voluminous, making it challenging to derive and convey insights. This motivates our ongoing efforts to apply information visualization techniques to software assurance data. While visualization techniques have long been applied to software itself, their application to software development processes and the data they yield is relatively novel. We report on several such applications and the insights they revealed. We offer some suggestions for further investigation of information visualization techniques applied to assurance data.
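As one illustration of the kind of visualization the paper motivates (the specific chart and all data below are hypothetical, not taken from the paper), a simple trend view of anomaly reports gathered during development can be produced with a few lines of plotting code:

```python
# Illustrative sketch with invented data: a basic view of software assurance
# process data - anomaly reports opened, closed, and the resulting backlog.

import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
opened = [12, 19, 25, 30, 22, 14]   # anomaly reports opened each month
closed = [8, 15, 20, 28, 27, 21]    # anomaly reports closed each month

# Running backlog of still-open anomaly reports.
backlog, running = [], 0
for o, c in zip(opened, closed):
    running += o - c
    backlog.append(running)

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(months, opened, alpha=0.6, label="opened")
ax.bar(months, closed, alpha=0.6, label="closed")
ax.plot(months, backlog, marker="o", color="black", label="open backlog")
ax.set_ylabel("anomaly reports")
ax.set_title("Anomaly report trend (hypothetical data)")
ax.legend()
plt.tight_layout()
plt.show()
```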


Hawaii International Conference on System Sciences | 2013

Tool Use within NASA Software Quality Assurance

Denise Shigeta; Daniel Port; Joel M. Wilf

As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to be able to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea: it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions it ultimately makes. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. To obtain it, an assurance organization needs accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support by assurance organizations within NASA and describe ongoing work at JPL on providing assurance organizations with the information about tools that they need in order to use them effectively.


Hawaii International Conference on System Sciences | 2017

A Decision-Theoretic Approach to Measuring Security

Dan Port; Joel M. Wilf

The question “is this system secure?” is notoriously difficult to answer. The question implies that there is a system-wide property called “security,” which we can measure against some meaningful threshold of sufficiency. In this concept paper, we discuss the difficulty of measuring security sufficiency, either directly or through a proxy such as the number of known vulnerabilities. We propose that the question can be better addressed by measuring confidence and risk in the decisions that depend on security. A novelty of this approach is that it integrates the use of both subjective information (e.g., expert judgment) and empirical data. We investigate how this approach uses well-known methods from the discipline of decision-making under uncertainty to provide a more rigorous and usable measure of security sufficiency.
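One way to read the proposed approach, sketched here with invented numbers and without claiming to reproduce the paper's method, is as a Bayesian decision problem: expert judgment supplies a prior on a security-relevant quantity, empirical data updates it, and the deployment decision is framed in terms of confidence and expected loss.

```python
# Illustrative sketch only: a Bayesian treatment of "is this system secure
# enough to deploy?" in the spirit of decision-making under uncertainty.
# The prior, data, costs, and thresholds are hypothetical, not from the paper.

from scipy.stats import beta

# Expert judgment encoded as a Beta prior on p, the probability that a given
# penetration test finds an exploitable weakness.
prior_alpha, prior_beta = 2, 18          # expert: roughly a 10% chance per test

# Empirical data: hypothetical results of 40 independent penetration tests.
tests_run, weaknesses_found = 40, 1

# Posterior combines the subjective prior with the observed data.
post = beta(prior_alpha + weaknesses_found,
            prior_beta + tests_run - weaknesses_found)

# Decision framing: deploy only if we are at least 90% confident that p is
# below a 10% tolerance.
tolerance, required_confidence = 0.10, 0.90
confidence = post.cdf(tolerance)

# Expected-loss comparison for deploy vs. delay, treating p as a rough
# stand-in for the chance of a successful attack (a deliberate simplification).
loss_if_exploited, cost_of_delay = 5_000_000, 200_000
expected_loss_deploy = post.mean() * loss_if_exploited
expected_loss_delay = cost_of_delay

print(f"P(p <= {tolerance:.0%}) = {confidence:.2f}")
print("Deploy" if confidence >= required_confidence
      and expected_loss_deploy < expected_loss_delay else "Delay")
```

The point of the framing is that the output is not a system-wide "security score" but a statement about confidence and risk in a specific decision, which is where subjective judgment and empirical evidence can be combined coherently.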


Hawaii International Conference on System Sciences | 2016

Developing a Value-Based Methodology for Satisfying NASA Software Assurance Requirements

Daniel Port; Joel M. Wilf; Madeline Diep; Carolyn B. Seaman; Martin S. Feather

NASA imposes a multitude of quality process requirements on the development of its software systems. One source of these is the Software Quality Assurance standard. All NASA-sponsored projects are expected to implement these requirements. However, given the diversity of projects and practices at different NASA centers, it is impossible to dictate a priori how these requirements are to be economically satisfied on a given project. Under the auspices of NASA's Software Assurance Research Program, the authors have been developing a value-based methodology to guide practitioners in defensibly and economically planning and executing assurance effort to satisfy this standard. The methodology exploits the intimate relationship between assurance value and risk-informed decision making. This paper describes this relationship, the value-based methodology for scaling assurance efforts, support for using the methodology, and our practice-based validation of the approach.


Hawaii International Conference on System Sciences | 2016

Introduction to IS Risk and Decision-Making Minitrack

Daniel Port; Joel M. Wilf

Minitrack introduction.


Innovations in Systems and Software Engineering | 2016

Metrics for V&V of cyber defenses

Martin S. Feather; Joel M. Wilf; Joseph Priest

There is a need for a disciplined approach for evaluating a cyber defense prior to its introduction into an operational environment. This is necessary to assess whether the benefits of the defense will be worth its costs and risks. A traditional V&V workflow is adapted for this purpose. The considerations it must take into account are described, as is the collection and presentation of pertinent metrics. An example of this workflow is given for a cyber defense against a “reconnaissance attack” that threatens information integrity and confidentiality.
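To make the cost/benefit framing concrete, the sketch below rolls a few plausible V&V metrics for a hypothetical defense into an expected net annual benefit; the metric names, values, and cost model are assumptions for illustration, not figures or formulas from the paper.

```python
# Hypothetical metric roll-up for evaluating a cyber defense prior to
# operational use. The defense, metrics, and numbers are illustrative only.

from dataclasses import dataclass

@dataclass
class DefenseMetrics:
    name: str
    detection_rate: float        # fraction of injected attacks detected in V&V
    false_positive_rate: float   # fraction of benign events flagged
    added_latency_ms: float      # performance cost observed in testbed runs
    annual_cost_usd: float       # licensing plus operations

def expected_net_benefit(m: DefenseMetrics,
                         attacks_per_year: float,
                         loss_per_attack: float,
                         cost_per_false_alarm: float,
                         benign_events_per_year: float) -> float:
    """Expected loss avoided minus false-alarm handling and operating costs."""
    avoided = m.detection_rate * attacks_per_year * loss_per_attack
    false_alarms = m.false_positive_rate * benign_events_per_year * cost_per_false_alarm
    return avoided - false_alarms - m.annual_cost_usd

recon_defense = DefenseMetrics("recon-attack defense", 0.85, 0.002, 3.5, 250_000)
net = expected_net_benefit(recon_defense,
                           attacks_per_year=4,
                           loss_per_attack=300_000,
                           cost_per_false_alarm=500,
                           benign_events_per_year=1_000_000)
print(f"{recon_defense.name}: expected net annual benefit = ${net:,.0f}")
```

Even this toy roll-up shows why the workflow matters: with these invented numbers the false-alarm burden nearly cancels the avoided losses, so the defense would not be worth fielding despite a high detection rate.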


Hawaii International Conference on System Sciences | 2013

Introduction to New Directions in Software Assurance Minitrack

Daniel Port; Jairus Hihn; Joel M. Wilf

Introduction to New Directions in Software Assurance Minitrack.

Collaboration


Dive into Joel M. Wilf's collaborations.

Top Co-Authors

Dan Port, University of Hawaii at Manoa
Martin S. Feather, California Institute of Technology
Jairus Hihn, California Institute of Technology
Joseph Priest, California Institute of Technology
Madeline Diep, University of Nebraska–Lincoln