John C. Kelly
California Institute of Technology
Publications
Featured research published by John C. Kelly.
Journal of Systems and Software | 1992
John C. Kelly; Joseph S. Sherif; Jonathan M. Hops
Abstract Software inspection is a technical evaluation process for finding and removing defects in requirements, design, code, and tests. The Jet Propulsion Laboratory (JPL), California Institute of Technology, tailored Fagan's original process of software inspections to conform to its software development environment in 1987. Detailed data collected from 203 inspections during the first three years of experience at JPL included averages of staff time expended, pages covered, major and minor defects found, and inspection team size. The data were tested for homogeneity. Randomized samples belonging to the various phases or treatments were analyzed using the completely randomized block design analysis of variance (α = 0.05). The results showed a significantly higher density of defects during requirements inspections. Defect densities decreased exponentially as the work products approached the coding phase, because defects were fixed when detected and did not migrate to subsequent phases. This resulted in a relatively flat profile for cost to fix. Increasing the pace of the inspection meeting decreased the density of defects found. This relationship held for both major and minor defect densities, although it was more pronounced for minor defects.
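The analysis the abstract describes can be illustrated with a short sketch. The Python fragment below shows how defect densities (defects per page) grouped by life-cycle phase might be compared with a randomized-block analysis of variance at α = 0.05, using pandas and statsmodels. All numbers, the block labels, and the choice of blocking factor are hypothetical placeholders for illustration only; they are not the JPL inspection data or the study's exact model.

```python
# Minimal sketch: comparing defect densities across life-cycle phases
# with a randomized-block ANOVA (alpha = 0.05). All values below are
# hypothetical placeholders, not the JPL inspection data.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical inspection records: phase (treatment), block (e.g., project),
# and defect density = defects found per page inspected.
records = pd.DataFrame({
    "phase":   ["requirements", "design", "code"] * 4,
    "block":   ["p1", "p1", "p1", "p2", "p2", "p2",
                "p3", "p3", "p3", "p4", "p4", "p4"],
    "density": [1.9, 1.1, 0.6, 2.2, 0.9, 0.5,
                1.7, 1.0, 0.7, 2.0, 1.2, 0.4],
})

# Randomized block design: phase is the treatment of interest,
# the block term absorbs project-to-project variation.
model = ols("density ~ C(phase) + C(block)", data=records).fit()
anova = sm.stats.anova_lm(model, typ=2)
print(anova)

alpha = 0.05
if anova.loc["C(phase)", "PR(>F)"] < alpha:
    print("Defect density differs significantly across phases.")
```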
annual software engineering workshop | 2001
David P. Gilliam; John D. Powell; John C. Kelly; Matt Bishop
The paper presents joint work by the California Institute of Technology's Jet Propulsion Laboratory and the University of California at Davis (UC Davis), sponsored by the National Aeronautics and Space Administration Goddard Independent Verification and Validation Facility, to develop a security assessment instrument for the software development and maintenance life cycle. It describes research on generating such an instrument to aid developers in assessing and assuring the security of software throughout the development and maintenance life cycles.
international workshop on software specification and design | 2000
Steven L. Cornford; Martin S. Feather; John C. Kelly; Timothy W. Larson; Burton Sigal; James D. Kiper
An assessment methodology is described and illustrated. This methodology separates assessment into the following phases: (1) elicitation of requirements; (2) elicitation of failure modes and their impact (risk of loss of requirements); (3) elicitation of failure mode mitigations and their effectiveness (degree of reduction of failure modes); and (4) calculation of outstanding risk taking the mitigations into account. This methodology, with accompanying tool support, has been applied to assist in planning the engineering development of advanced technologies. Design assessment features prominently in these applications. The overall approach is also applicable to development assessment (of the development process to be followed to implement the design). Both design and development assessments are demonstrated on hypothetical scenarios based on the workshop's TRMCS case study. TRMCS information has been entered into the assessment support tool and serves as illustration throughout.
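The four-phase structure lends itself to a small illustration. The sketch below is one plausible way to combine elicited requirement weights, failure-mode impacts, and mitigation effectiveness into an outstanding-risk figure. The multiplicative treatment of mitigations, the class and field names, and all numbers are assumptions made for illustration; they are not necessarily the calculation used by the paper's tool.

```python
# Minimal sketch of the four assessment phases: requirements, failure modes
# with impacts, mitigations with effectiveness, and an outstanding-risk
# calculation. The multiplicative reduction model and all data below are
# illustrative assumptions, not the paper's actual formulas.
from dataclasses import dataclass, field

@dataclass
class FailureMode:
    name: str
    impact: dict                     # requirement name -> probability of losing it
    mitigations: list = field(default_factory=list)  # effectiveness values in [0, 1]

    def outstanding_risk(self, requirement_weights: dict) -> float:
        # Assumed model: each mitigation independently removes a fraction of
        # the failure mode, so the surviving fraction is the product of
        # (1 - effectiveness) terms.
        surviving = 1.0
        for effectiveness in self.mitigations:
            surviving *= (1.0 - effectiveness)
        raw = sum(prob * requirement_weights[req] for req, prob in self.impact.items())
        return raw * surviving

# Phase 1: requirements and their relative weights (hypothetical).
weights = {"deliver telemetry": 1.0, "survive launch loads": 0.8}

# Phases 2-3: a failure mode, its impact, and mitigation effectiveness.
fm = FailureMode(
    name="sensor saturation",
    impact={"deliver telemetry": 0.4},
    mitigations=[0.5, 0.3],          # e.g., design review, added test coverage
)

# Phase 4: outstanding risk after mitigations.
print(fm.outstanding_risk(weights))  # 0.4 * 1.0 * 0.5 * 0.7 = 0.14
```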
workshops on enabling technologies: infrastructure for collaborative enterprises | 2001
David P. Gilliam; John C. Kelly; John D. Powell; Matt Bishop
The paper discusses joint work by the California Institute of Technology's Jet Propulsion Laboratory and the University of California at Davis (UC Davis), sponsored by the National Aeronautics and Space Administration, to develop a security assessment instrument for the software development and maintenance life cycle. The assessment instrument is a collection of tools and procedures to support the development of secure software. Specifically, the instrument offers a formal approach for engineering network security into software systems and applications throughout the software development and maintenance life cycle. The security assessment instrument includes a Vulnerability Matrix (VMatrix) with platform/application and signature fields in a database. The information in the VMatrix has become the basis for the Database of Vulnerabilities, Exploits, and Signatures (DOVES) at UC Davis. The instrument also includes a set of Security Assessment Tools (SAT), including a property-based testing tool developed by UC Davis that slices software code looking for specific vulnerability properties. A third component of the research is an investigation into the verification of software designs for compliance with security properties, based on innovative model checking approaches that will facilitate the development and verification of software security models.
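To make the idea of a vulnerability matrix with platform/application and signature fields concrete, the sketch below defines a minimal record of that shape and a simple lookup over a list of records. The field names, example entries, and helper function are hypothetical illustrations only; they do not reflect the actual VMatrix or DOVES schema.

```python
# Minimal sketch of a vulnerability-matrix record with platform/application
# and signature fields. Field names, data, and the lookup helper are
# hypothetical illustrations, not the actual VMatrix/DOVES schema.
from dataclasses import dataclass

@dataclass
class VulnerabilityEntry:
    vuln_id: str       # identifier for the vulnerability
    platform: str      # affected platform or application
    description: str   # short description of the exposure
    signature: str     # detection signature a scanner or test could look for

def entries_for_platform(entries, platform):
    """Return all matrix entries that apply to a given platform."""
    return [e for e in entries if e.platform == platform]

matrix = [
    VulnerabilityEntry("V-0001", "web-server",
                       "unchecked buffer in request parser",
                       "strcpy without length check"),
    VulnerabilityEntry("V-0002", "mail-daemon",
                       "format string in logging call",
                       "printf with user-controlled format"),
]

for entry in entries_for_platform(matrix, "web-server"):
    print(entry.vuln_id, entry.description)
```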
conference on software engineering education and training | 2000
Ken Abernethy; John C. Kelly; Ann E. Kelley Sobel; James D. Kiper; John D. Powell
Accurate and complete requirements specifications are crucial for the design and implementation of high-quality software. Unfortunately, the articulation and verification of software system requirements remains one of the most difficult and error-prone tasks in the software development life cycle. The use of formal methods, based on mathematical logic and discrete mathematics, holds promise for improving the reliability of requirements articulation and modeling. However, formal modeling and reasoning about requirements has not typically been a part of the software analyst's education and training, and because the learning curve for the use of these methods is nontrivial, adoption of formal methods has proceeded slowly. As a consequence, technology transfer is a significant issue in the use of formal methods. In this paper, several efforts undertaken at NASA aimed at increasing the accessibility of formal methods are described. These include the production of the following: two NASA guidebooks on the concepts and applications of formal methods, a body of case studies in the application of formal methods to the specification of requirements for actual NASA projects, and course materials for a professional development course introducing formal methods and their application to the analysis and design of software-intensive systems. In addition, efforts undertaken at two universities to integrate instruction on formal methods based on these NASA materials into the computer science and software engineering curricula are described.
Microelectronics Reliability | 1992
Yosef S. Sherif; John C. Kelly
Abstract The software inspection process was created for the dual purpose of improving software quality and increasing programmers' productivity. This paper puts forward formal inspections as an alternative to, and a better method than, technical walkthroughs in the software life-cycle review process. Examples of benefits gained in the development of defect-free software by utilizing formal inspections are cited.
conference on scientific computing | 1992
Ken Abernethy; John C. Kelly
Object-oriented analysis is the newest component of a proposed object-oriented software life cycle methodology. In this paper, we make comparisons between the standard data flow diagram (DFD) models and the newly introduced object models within the context of an existing moderately complex (approx. 65,000 lines) software project. In particular, we compare the complexities of competing models for the project domain using some simple metrics.
workshops on enabling technologies: infrastructure for collaborative enterprises | 2000
David P. Gilliam; John C. Kelly; Matt Bishop
Archive | 1999
Martin S. Feather; John C. Kelly; James D. Kiper
annual software engineering workshop | 2002
John C. Kelly