
Publication


Featured research published by Warren Harrison.


IEEE Transactions on Software Engineering | 1992

An entropy-based measure of software complexity

Warren Harrison

It is proposed that the complexity of a program is inversely proportional to the average information content of its operators. An empirical probability distribution of the operators occurring in a program is constructed, and the classical entropy calculation is applied. The performance of the resulting metric is assessed in the analysis of two commercial applications totaling well over 130,000 lines of code. The results indicate that the new metric does a good job of associating modules with their error spans (the average number of tokens between error occurrences).
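The classical entropy calculation over an empirical operator distribution can be sketched in a few lines of Python. This is a toy illustration: the operator stream and the mapping from entropy to a complexity score are assumptions, not the paper's exact definitions.

```python
from collections import Counter
from math import log2

def operator_entropy(operators):
    """Shannon entropy (in bits) of the empirical operator distribution.

    Under the paper's premise, operators carrying more average
    information (higher entropy) indicate lower complexity.
    """
    counts = Counter(operators)
    total = sum(counts.values())
    # H = -sum(p_i * log2(p_i)) over the observed operator frequencies.
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Hypothetical operator stream extracted from a small module.
ops = ["=", "=", "=", "+", "+", "*", "if", "if", "=", "+"]
h = operator_entropy(ops)
```

A uniform operator distribution maximizes the entropy; a module dominated by a single operator scores close to zero bits.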


International Conference on Software Maintenance | 1990

Insights on improving the maintenance process through software measurement

Warren Harrison; Curtis R. Cook

The authors develop a model of software maintenance based upon an objective decision rule which determines whether a given software module can be effectively modified or whether it should instead be rewritten. Completely rewriting a module can be expensive, but it can be even more expensive if the module's structure has been severely degraded over successive maintenance activities. A module that is likely to experience significant maintenance activity is called change-prone. The authors suggest that early identification of change-prone modules through the use of change measures across release cycles can be an effective technique in allocating maintenance resources.


IEEE Software | 2006

Giving Back

Warren Harrison

The outgoing editor in chief of IEEE Software reflects on the personal and professional rewards of the volunteerism that sustains the technical community and society at large.


Journal of Management Information Systems | 1988

Using software metrics to allocate testing resources

Warren Harrison

Several methods of allocating testing resources among modules within a large software development project are examined and evaluated. The objective of each allocation scheme is to distribute resources across modules in proportion to the predicted number of errors each module contains (i.e., resources per error are the same for each module). By far the most uniform allocation of resources per error is obtained through the use of software complexity metrics. However, the allocation was not perfect under any of the strategies considered, suggesting that additional work must be performed to arrive at better approaches to allocation.
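A proportional-allocation rule of the kind the abstract evaluates can be sketched briefly in Python. The module names and error predictions below are invented for illustration; the paper's actual prediction models are not reproduced here.

```python
def allocate_testing(budget_hours, predicted_errors):
    """Split a testing budget across modules in proportion to each
    module's predicted error count, so that resources-per-error is
    uniform across modules."""
    total = sum(predicted_errors.values())
    return {module: budget_hours * errors / total
            for module, errors in predicted_errors.items()}

# Hypothetical predictions, e.g. derived from a complexity metric.
plan = allocate_testing(100.0, {"parser": 12, "codegen": 6, "ui": 2})
# Every module receives the same 5 hours per predicted error.
```

In practice the predictions themselves are imperfect, which is exactly why the abstract reports that no strategy achieved a perfectly uniform allocation.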


Journal of Systems and Software | 1987

A micro/macro measure of software complexity

Warren Harrison; Curtis R. Cook

A software complexity metric is a quantitative measure of the difficulty of comprehending and working with a specific piece of software. The majority of metrics currently in use focus on a program's “microcomplexity”: how difficult the details of the software are to deal with. This paper proposes a method of measuring the “macrocomplexity,” i.e., how difficult the overall structure of the software is to deal with, as well as the microcomplexity. We evaluate this metric using data obtained during the development of a compiler/environment project involving over 30,000 lines of C code. The new metric's performance is compared to that of several other popular metrics, with mixed results. We then discuss how these metrics, or any other metrics, may be used to help increase project management efficiency.


Journal of Systems and Software | 2004

A flexible method for maintaining software metrics data: a universal metrics repository

Warren Harrison

A neglected aspect of software measurement programs is what will be done with the metrics once they are collected. Databases of metrics information often tend to be developed as an afterthought, with little, if any, concession to future data needs or to long-term, sustained metrics collection efforts. A metrics repository should facilitate an ongoing metrics collection effort, as well as serve as the “corporate memory” of past projects, their histories and experiences. To address these issues, we describe a transformational view of software development that treats the software development process as a series of artifact transformations. Each transformation has inputs and produces outputs. This approach supports a very flexible software engineering metrics repository.


Software Process: Improvement and Practice | 2000

Coordinating models and metrics to manage software projects

David Raffo; Warren Harrison; Joseph Vandeville

In previous work we developed techniques for modeling software development processes quantitatively in terms of development cost, product quality, and project schedule using simulation. This work has predominantly been applied to the software project management planning function. The goal of our current work is to develop a ‘forward-looking’ approach that integrates metrics with simulation models of the software development process in order to support the software project management controlling function. This ‘forward-looking’ approach provides predictions of project performance and of the impact of various management decisions. It can be used to assess the project's conformance to planned schedule and resource consumption. This paper reports on work with a leading software development firm to create an approach that includes a flexible metrics repository and a discrete event simulation model.


IEEE Software | 1990

Tools for multiple-CPU environments

Warren Harrison; B. Kramer; W. Rudd; S. Shatz; C. Chang; Z. Segall; D. Clemmer; J. Williamson; B. Peek; B. Appelbe; K. Smith; A. Kolawa

A brief overview precedes ten separate tool reviews. Five of the tools address the problems of performance analysis, testing, and debugging in a multiple-CPU environment. The first set of tools (Graspin, PPSE, and Integral) supports this activity by providing specification or design languages for concurrent applications. The next pair of tools (Pie and Total) supports the development of multiple-CPU software by representing the software's behavior in a parallel or concurrent environment. The next set of five tools is aimed at the problem of serial-to-parallel conversion. The first three (E/SP, Mimdizer, and PRETS) recapture the design of the original source code and display it in graphical form for analysis. The remaining tools (Pat and Aspar) support direct source-to-source transformations. These ten tools are representative of current approaches to the problem of multiple-CPU computing.


annual software engineering workshop | 2002

A software engineering lessons learned repository

Warren Harrison

Most software organizations possess a large but informal corporate memory. This corporate memory comprises the experiences of every software engineer and manager, yet it is informal because there is seldom an institutionalized mechanism for disseminating the wisdom. To exploit this informal corporate memory, the key points of each experience can be placed into a repository for later dissemination. We describe a lessons learned repository (LLR) that facilitates such dissemination.


Software Quality Journal | 1999

Technology Review: Adapting Financial Measures: Making a Business Case for Software Process Improvement

Warren Harrison; David Raffo; John W. Settle; Nancy Eickelmann

Software firms invest in process improvements in order to benefit from decreased costs and/or increased productivity sometime in the future. Such efforts are seldom cheap, and they typically require making a business case in order to obtain funding. We review some of the main techniques from financial theory for evaluating the risk and returns associated with proposed investments and apply them to process improvement programs for software development. We also discuss significant theoretical considerations as well as robustness and correctness issues associated with applying each of the techniques to software development and process improvement activities. Finally, we introduce a present value technique that incorporates both risk and return, has many applications to software development activities, and is recommended for use in a software process improvement context.
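The discounting idea behind a present value technique can be illustrated in Python. The cash flows and discount rate below are hypothetical, and the paper's actual risk adjustment may differ; this is only a minimal sketch of how risk can be folded into the rate.

```python
def present_value(cash_flows, discount_rate):
    """Net present value of yearly net cash flows (year 0 first).

    Risk is encoded in the discount rate: a riskier process
    improvement program is evaluated with a higher rate, which
    shrinks the value of distant future savings.
    """
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows))

# Hypothetical SPI program: $50k up-front cost, then $20k/year savings.
npv = present_value([-50_000, 20_000, 20_000, 20_000, 20_000], 0.10)
# A positive NPV supports the business case for the improvement.
```

Raising the discount rate for a riskier program can flip the same cash-flow stream from a positive to a negative NPV, which is the core of making a risk-aware business case.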

Collaboration


Warren Harrison's top co-authors:

David Raffo (Portland State University)
Barry W. Boehm (University of Southern California)
Bart Massey (Portland State University)
Bruce Gifford (Portland State University)
John W. Settle (Portland State University)
Mark Morrissey (Portland State University)