
Publication


Featured research published by Karen E. Huff.


software engineering symposium on practical software development environments | 1989

A plan-based intelligent assistant that supports the software development process

Karen E. Huff; Victor R. Lesser

We describe how an environment can be extended to support the process of software development. Our approach is based on the AI planning paradigm. Processes are formally defined hierarchically via plan operators, using multiple levels of abstraction. Plans are constructed dynamically from the operators; the sequences of actions in plans are tailored to the context of their use, and conflicts among actions are prevented. Monitoring of the development process, to detect and avert process errors, is accomplished by plan recognition; this establishes a context in which programmer-selected goals can be automated via plan generation. We also show how nonmonotonic reasoning can be used to make an independent assessment of the credibility of complex process alternatives, and yet accede to the programmer's superior judgment. This extension to intelligent assistance provides deeper understanding of software processes.
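The abstract names the key ingredients (plan operators, dynamic plan construction, plan recognition) without giving definitions. As a rough, hypothetical sketch of the STRIPS-style operator encoding the planning paradigm implies, assuming predicate-set states and an invented "compile" operator:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Operator:
    """A plan operator: applicable when its preconditions hold in the
    current state; applying it adds and deletes state predicates."""
    name: str
    preconditions: frozenset
    add_effects: frozenset
    delete_effects: frozenset

def applicable(op, state):
    return op.preconditions <= state

def apply_op(op, state):
    assert applicable(op, state), f"{op.name}: preconditions unmet"
    return (state - op.delete_effects) | op.add_effects

# Invented development-process operator: compiling module m1 requires
# that it is checked out and its build environment is ready.
compile_m1 = Operator(
    name="compile",
    preconditions=frozenset({("checked_out", "m1"), ("env_ready", "m1")}),
    add_effects=frozenset({("object_code", "m1")}),
    delete_effects=frozenset(),
)

state = frozenset({("checked_out", "m1"), ("env_ready", "m1")})
state = apply_op(compile_m1, state)
assert ("object_code", "m1") in state
```

Hierarchical definition, as described, would add abstract operators that decompose into sequences of such primitive ones.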


international software process workshop | 1988

Probing limits to automation: towards deeper process models

Karen E. Huff

Recent progress in specifying and prescribing software development processes has demonstrated that automated support is an achievable goal. Automated support can, for example, prevent a programmer from starting compilations before an appropriate environment is set up, enforce a policy of regression and performance testing before a customer release, and ensure that new source versions are checked back into the source code control system. Such support will be valuable to programmers, whose intellectual energies can be freed from many process details in order to be directed towards other pressing concerns; and it will be valuable to managers, who can be assured that certain project policies and conventions are routinely observed.
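A minimal sketch of what such policy enforcement could look like; the step and rule names below are invented for illustration, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class ProjectState:
    completed_steps: set = field(default_factory=set)

def check_policy(action, state):
    """Return None if the action may proceed, otherwise name the
    missing prerequisite steps. Step names are illustrative only."""
    rules = {
        "compile": {"environment_setup"},
        "customer_release": {"regression_tests_passed",
                             "performance_tests_passed"},
        "finish_task": {"sources_checked_in"},
    }
    missing = rules.get(action, set()) - state.completed_steps
    return None if not missing else f"{action} blocked, missing: {missing}"

state = ProjectState(completed_steps={"environment_setup"})
assert check_policy("compile", state) is None
print(check_policy("customer_release", state))
```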


international software process workshop | 1989

Software process instantiation and the planning paradigm

Karen E. Huff

One of the challenges in finding a suitable paradigm for defining software processes relates to the highly dynamic nature of software development. The actions required to carry out a given process are often dependent on detailed aspects of the current state. For example, the modules that need to be archived are just those that have been changed since they were last archived; the choice of archival method (i.e., source code control system versus local backup copy) depends on whether the module is in a steady state or still rapidly changing; use of certain debugging features is ruled out unless compilations were done with optimization disabled; and many quite diverse factors influence the choice of relevant test cases to be run against a particular system version. The fact that development takes place, in part, through trial and error means that failures occur frequently, and recovery from failure must be accommodated. In spite of the programmer's best intentions, design reviews reveal areas needing rework, compilers find errors in source code, test cases fail, and tools sometimes exhibit bugs themselves.

Powerful instantiation facilities represent one way to handle this type of dynamic behavior. A software project defines the process its programmers are to follow, and the actions taken by the programmers represent instantiations of this process. At an absolute minimum, instantiation involves binding parameters in the process definition to specific software objects in the environment.

In this paper, we describe the relationship between instantiation facilities and the dynamics of process activities. First, we illustrate some limitations in instantiation mechanisms using a familiar paradigm: procedural definitions of processes. Then we describe a different approach to instantiation used in the AI planning paradigm.
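To make the instantiation idea concrete, here is a small hypothetical sketch of state-dependent binding using the paper's archiving example; the Module fields and method names are assumptions, not the paper's notation:

```python
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    changed_since_archive: bool
    steady_state: bool

def instantiate_archive_steps(modules):
    """Bind the generic 'archive' step to the specific modules and
    methods the current state calls for: only changed modules are
    archived, and the method depends on whether the module is stable."""
    steps = []
    for m in modules:
        if not m.changed_since_archive:
            continue  # nothing to archive for this module
        method = "sccs_checkin" if m.steady_state else "local_backup"
        steps.append((method, m.name))
    return steps

mods = [Module("parser", True, True), Module("ui", True, False),
        Module("docs", False, True)]
print(instantiate_archive_steps(mods))
# [('sccs_checkin', 'parser'), ('local_backup', 'ui')]
```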


international software process workshop | 1996

Process measurement through process modeling and simulation

Karen E. Huff

Measurement is an important part of the characterization of a process and the context in which it should be used and reused. We discuss our experience using process modeling to support process/product measurement. We briefly introduce APEX, a process modeling and simulation tool that embeds a Petri net representation in a discrete event simulation environment. Then, we discuss how process modeling and simulation with APEX supports experimentation with aspects of a measurement plan, and allows the evaluation and refinement of a plan prior to its deployment.
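APEX itself is not shown in the abstract; the following toy suggests, under invented place names, transitions, and delays, how a Petri net firing rule can be embedded in a discrete-event simulation loop:

```python
import heapq

# Toy timed Petri net fired inside a discrete-event loop. Token counts
# per place; a transition consumes its input tokens when enabled and
# deposits its output tokens after a delay. All names are illustrative.
places = {"coded": 1, "reviewed": 0, "tested": 0}
transitions = [
    ("review", {"coded": 1},    {"reviewed": 1}, 2.0),
    ("test",   {"reviewed": 1}, {"tested": 1},   5.0),
]

clock, pending = 0.0, []
while True:
    for name, ins, outs, delay in transitions:
        if all(places[p] >= n for p, n in ins.items()):
            for p, n in ins.items():
                places[p] -= n                      # consume inputs now
            heapq.heappush(pending, (clock + delay, name, outs))
    if not pending:
        break
    clock, name, outs = heapq.heappop(pending)
    for p, n in outs.items():
        places[p] += n                              # outputs at completion
    print(f"t={clock}: {name} completed")
```

Running a model like this repeatedly is what makes it possible to try out a measurement plan before deploying it on a live project.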


international software process workshop | 1991

Supporting Change In Plan-based Processes

Karen E. Huff

Change is problematic for process definition and enaction paradigms. The changes and their consequences may be complex, creating a ripple effect throughout the process definition. For example, a new configuration management strategy is likely to have pervasive consequences. The process being changed may not be in a quiescent state; active instances may be affected. What should be done about all the testing in progress when a testing strategy is changed? The problem of correlating process history (a record of the process steps executed) with the process definition is harder if changes occur midstream. Testing or verification of changes is needed, but these are not settled issues for processes even in the absence of change.
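As a loose illustration of the non-quiescence problem, a hypothetical check for which running instances a definition change touches (all structures here are invented, not from the paper):

```python
from dataclasses import dataclass

@dataclass
class Instance:
    id: str
    history: list      # steps already executed
    active_steps: set  # steps currently in progress

def affected_by(changed_step, instances):
    """Instances that executed or are executing the changed step;
    these are the ones a midstream change must reconcile."""
    return [i.id for i in instances
            if changed_step in i.active_steps or changed_step in i.history]

runs = [Instance("r1", ["design"], {"unit_test"}),
        Instance("r2", ["design", "unit_test"], {"release"})]
print(affected_by("unit_test", runs))  # ['r1', 'r2']
```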


international software process workshop | 1990

On the Relationship Between Software Processes and Software Products

Karen E. Huff

A primary objective of having formal and enactable models of software processes is to extend software development environments to provide assistance to programmers following those processes. The GRAPPLE system [Huff and Lesser, 1988; Huff, 1989] implements one approach to intelligent assistance, where the programmer can retain control of executing a process or the system can take over to automate parts of the process. GRAPPLE provides error detection and correction, generation of summaries of tasks completed and agendas of tasks to do, as well as automation of tasks. A data structure, called a plan, is used to represent the rationale for and status of all activities in progress; these plans are incrementally elaborated and executed.
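The plan data structure is described only at this level of detail; a minimal hypothetical sketch of a plan node carrying rationale and status, with incremental elaboration and agenda generation, might look like:

```python
from dataclasses import dataclass, field

@dataclass
class PlanNode:
    """One goal in a plan, with its rationale and execution status;
    field names are assumptions, not GRAPPLE's actual structure."""
    goal: str
    rationale: str = ""
    status: str = "pending"        # pending | in_progress | done | failed
    children: list = field(default_factory=list)

    def expand(self, subgoals, rationale=""):
        """Incremental elaboration: refine this goal into subgoals."""
        self.children = [PlanNode(g, rationale) for g in subgoals]
        return self.children

    def agenda(self):
        """Remaining tasks, in the spirit of the agenda generation
        the abstract mentions."""
        if not self.children:
            return [self.goal] if self.status != "done" else []
        return [g for c in self.children for g in c.agenda()]

root = PlanNode("build_new_version", "release milestone")
root.expand(["setup_environment", "compile_and_link", "run_tests"])
print(root.agenda())
```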


international software process workshop | 1989

GRAPPLE example: processes as plans

Karen E. Huff

As an example of a small part of the software process, consider the activities preliminary to building a new version of a software system; one task involves setting up an appropriate environment in which to apply the compilation and linking tools. This task can be done in several different ways. A common approach under UNIX™ entails collecting all the compilation units representing the baseline from which the new system version is to be developed into a “reference directory” and collecting all new compilation units unique to the new system version into a “working directory”. When two (or more) new system versions are being developed from the same baseline, they can share one reference directory, but each system version always needs its own working directory. Obviously, one directory cannot serve as both a reference and a working directory. In this approach, setting up an environment consists of a hierarchy of goals shown informally in the figure below. Setting up the reference directory and setting up the working directory break down into further subgoals: the programmer has to commit an appropriate directory, empty it of extraneous files, and fill it with the relevant files. These goals are accomplished with operating system commands to make directories (mkdir), copy files (copy), move files (move), and delete files (delete).

The state schema for this example includes four types of entities: directories, files, contents of files, and systems. Predicates are defined to record that files are in directories, contents are stored in files, contents are of different kinds (source, include, executable, etc.), contents can be part of systems, contents can represent the executable forms of systems, etc. Directories can be used by systems as reference or working directories.

The example shows four characteristics that we believe are typical of processes. First, there is an obvious hierarchical structure. Second, there are many different action sequences that can be used to achieve a given goal. In some cases, there are several operators to choose from; for example, removing extraneous files can be achieved by delete or move, or a combination thereof. In other cases, the actions chosen depend on the context: whether there are existing, uncommitted directories, or what the current placement of files in those directories is. Third, there is a high degree of interleaving of tasks. For example, actions to set up the reference directory can be intermixed with actions to set up the working directory, or actions from two separate setup-environment plans can be intermixed. Fourth, conflicts can arise that will destroy the effectiveness of the plans. If the reference directory is already set up, and the programmer is now setting up the working directories, then there are better ways of emptying the working directory of extraneous files than moving them into the reference directory: this will destroy the “readiness” of the reference directory, meaning that more work will have to be done to reestablish “readiness”. Such a conflict leads, at best, to an inefficient plan; if not detected, it leads to an improper plan (when the file is never subsequently moved out of the reference directory).

The definitions given below are taken from an example that runs in the GRAPPLE plan-based process support system.
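A hedged sketch of the conflict check this example motivates; the predicate names and the move semantics are assumptions for illustration, not GRAPPLE's actual definitions:

```python
# Predicates are (name, args...) tuples. A 'move' that dumps an
# extraneous file into a ready reference directory clobbers the
# "readiness" condition another plan step still relies on.
def move(state, f, src, dst):
    state = state - {("in_dir", f, src)} | {("in_dir", f, dst)}
    # moving an unrelated file into a ready directory destroys readiness
    if ("ready", dst) in state and ("relevant", f, dst) not in state:
        state = state - {("ready", dst)}
    return state

def clobbers(state_after, protected):
    """Report protected conditions the action would undo."""
    return protected - state_after

state = {("in_dir", "junk.o", "workdir"), ("ready", "refdir")}
protected = {("ready", "refdir")}   # another plan step still needs this
after = move(state, "junk.o", "workdir", "refdir")
print(clobbers(after, protected))   # {('ready', 'refdir')}
```

Detecting such clobbered conditions before execution is what lets a planner reject the cheap-but-improper emptying strategy in favor of a delete.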


Archive | 1989

Plan-based intelligent assistance: an approach to supporting the software development process

Karen E. Huff; Victor R. Lesser


international symposium on computer and information sciences | 1982

Knowledge-based command understanding: an example for the software development environment

Karen E. Huff; Victor R. Lesser


Archive | 1987

The role of plan recognition in design of an intelligent user interface

Carol A. Broverman; Karen E. Huff; Victor R. Lesser

Collaboration


Dive into Karen E. Huff's collaborations.

Top Co-Authors

Victor R. Lesser
University of Massachusetts Amherst

Margaret E. Connell
University of Massachusetts Amherst

Lawrence S. Lefkowitz
University of Massachusetts Amherst