Terry Shepard
Royal Military College of Canada
Publication
Featured research published by Terry Shepard.
Communications of The ACM | 2001
Terry Shepard; Margaret Anne Lamb; Diane Kelly
If testing is broadened to include all of Verification and Validation (V&V), the situation is even more serious. Highly effective practices such as software inspection [5] are hardly taught at all, and many computer science professors do not know (or care) what inspection is and why it is valuable. With the current pace of software development, V&V techniques must become ever more efficient and effective. Students today are not well equipped to apply widely practiced techniques, and have even further to go to understand techniques that could improve current practice. They are graduating with a serious gap in the knowledge they need to be effective software developers. Even new software engineering curricula tend to be weak in V&V. We describe the teaching of testing in an undergraduate software specialization in a computer engineering curriculum and in a graduate course on V&V, and we compare the two approaches. We believe the equivalent of several courses on testing, software quality, and the broader issues of V&V should be available to undergraduates, and should be mandatory for software engineers. The undergraduate curriculum we describe is moving in this direction by including V&V material in several courses. At the graduate level, particularly for students who already have industrial experience, a significant part of the material can be concentrated into a single course, sacrificing depth in favor of breadth. Software V&V techniques are part of the larger discipline of software engineering. Software engineering is an emerging discipline, first identified more than 30 years ago [9]. Today, most undergraduate computing curricula are based on the 1991 ACM/IEEE Computing Curriculum (www.acm.org/education/curricula.html). It sets a minimum of only eight lecture hours to cover the whole of V&V, including reviews, testing, and elementary proofs of correctness.
This situation is only slightly improved in the draft Curriculum 2001 (www.acm.org/sigcse/cc2001/): fewer hours (six) are suggested in the core for software validation; formal proofs of correctness are gone, and inspection has been added. More recently, curricula in software engineering (as opposed to computer science) have started to appear. There is not yet a definitive source for guidance on such curricula. One source is the 1999 Guidelines for Software Engineering Education, Version 1.0 (www.sei.cmu.edu). Testing typically takes 50% or more of the resources for software development projects.
Journal of Systems and Software | 2004
Diane Kelly; Terry Shepard
Software inspection is recognized as an effective verification technique. Despite this fact, the use of inspection is surprisingly low. This paper describes a new inspection technique, called task-directed inspection (TDI), and a lightweight process that were used to introduce inspection in a particular industrial environment. This environment had no history of inspections and was resistant to the idea, but confidence in a safety-related legacy suite of software had to be increased. The characteristics of TDI are explored. They give rise to a variety of approaches that may encourage more widespread use of inspections. This paper examines the industrial exercise as a case study, with the intent that it be useful in other situations that share characteristics with the situation described.
Software Testing, Verification & Reliability | 2004
Diane Kelly; Terry Shepard
Software inspections are an intensely people-centric activity. Even though this is routinely recognized in industry, much of the research focuses on inspection mechanics. During three years of inspection experiments, although the main purpose of the experiments was to investigate the effectiveness of a particular technique, the inspectors involved provided broad comments on many other aspects of inspections. Their comments were collected and organized into themes. These themes are presented here as a set of maxims that cover all the topics the inspectors felt were important as they endeavoured to do good inspections.
Lecture Notes in Computer Science | 2003
Ross McKegney; Terry Shepard
In this paper, we consider interface contracts as a possible mechanism for improving semantic integrity in component-based systems. A contract is essentially a formal specification interleaved with code, allowing a component or object to specify its behaviour unambiguously. The existing techniques that we survey are predominantly designed for object-oriented systems; we therefore investigate the extent to which they can be scaled up to the level of components and embedded in interface specifications rather than code. We conclude that interleaved specifications are viable and useful at the level of components, but that future work is required to develop languages that can express the constraints that are important at this level of granularity.
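The flavour of contract the abstract describes can be illustrated with a small design-by-contract sketch. This is not from the paper; the decorator and the `BoundedStack` component are hypothetical names chosen for illustration, and a real contract language would attach these conditions to the interface specification rather than the implementation.

```python
# Minimal design-by-contract sketch: pre- and postconditions attached
# to the operations of a component interface (illustrative names only).

def contract(pre=None, post=None):
    """Wrap an operation so its pre/postconditions are checked at call time."""
    def wrap(fn):
        def checked(self, *args, **kwargs):
            if pre is not None:
                assert pre(self, *args, **kwargs), f"precondition violated: {fn.__name__}"
            result = fn(self, *args, **kwargs)
            if post is not None:
                assert post(self, result), f"postcondition violated: {fn.__name__}"
            return result
        return checked
    return wrap

class BoundedStack:
    """A component whose interface states its behaviour via contracts."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    @contract(pre=lambda self, x: len(self.items) < self.capacity,
              post=lambda self, r: len(self.items) >= 1)
    def push(self, x):
        self.items.append(x)

    @contract(pre=lambda self: len(self.items) > 0)
    def pop(self):
        return self.items.pop()
```

A client that pushes onto a full stack fails the precondition immediately, turning a silent integrity violation into an explicit, attributable failure, which is the semantic-integrity benefit the paper is after.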
Software - Practice and Experience | 1992
W. Morven Gentleman; Terry Shepard; Douglas V. P. Thoreson
This paper discusses rendezvous on multiprocessors. Three different approaches are compared, represented by three specific systems: Ada, Harmony and BNR Pascal. All three permit tasks to run on multiple processors and use blocking communications primitives, but there are significant differences. For example, control over replying to messages out of sequence and over the allocation of tasks to processors is omitted in Ada, but is available in Harmony. The approach represented by BNR Pascal follows a middle road between Harmony and Ada: a low-level protocol, invisible to the programmer, is used to ensure communications reliability, but the programmer is aware of when a rendezvous is remote. If performance, verbosity, and robustness are ignored, all three approaches are equivalent. To illustrate this equivalence, and to demonstrate clearly the complexity of the Ada rendezvous, an Ada rendezvous administrator written using Harmony is described. A second method of adapting Harmony to Ada is also presented, in which the Harmony primitives are modified to be closer to Ada. In practice, using Harmony primitives directly will usually result in better programs. It is argued that something very much like the rendezvous administrator is needed for any actual implementation of the Ada rendezvous.
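The send/receive/reply pattern the paper compares can be sketched with threads and queues. This is a rough illustration of the idea, not the Harmony or Ada mechanism itself; the `Server` class and message encoding are invented for the example. Because each request carries its own reply channel, the server could hold requests and reply to them out of sequence, the flexibility the paper notes Harmony offers and Ada omits.

```python
# Sketch of a blocking send/receive/reply rendezvous using threads.
import queue
import threading

class Server:
    def __init__(self):
        self.requests = queue.Queue()

    def send(self, msg):
        # Client side: send a request, then block until the server
        # replies -- this blocking wait is the rendezvous.
        reply = queue.Queue(maxsize=1)
        self.requests.put((msg, reply))
        return reply.get()

    def serve_one(self):
        # Server side: receive blocks until a request arrives;
        # do the work, then reply on the request's own channel.
        msg, reply = self.requests.get()
        reply.put(msg * 2)

srv = Server()
threading.Thread(target=srv.serve_one).start()
result = srv.send(21)  # blocks until the server thread replies
```

An Ada entry call bundles the send and the reply-wait into one language construct; the sketch above makes the two halves visible, which is roughly what a rendezvous administrator must manage explicitly.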
sei conference on software engineering education | 1995
Terry Shepard
A graduate course on software verification and validation (V&V) is described. The field is broad, covering approaches to testing; formal methods of verification; and techniques for ensuring that software is trustworthy (more reliable than can be measured). This breadth makes it challenging to decide what to include in a single course. Justifications are given for some of the choices made. Choices of material for undergraduate curricula are even more difficult to make. Some suggestions are offered in this area. Experience with courses on V&V is relatively thin, and the basis for teaching much of the material is rapidly evolving, so stable, widely used curricula are still some time away.
international conference on software engineering | 2001
Terry Shepard
There is increasing urgency to put software engineering (SE) programs in place at universities in North America. For years, the computer science and professional engineering communities neglected the area, but both are now paying serious attention. There is creative tension as efforts to define the field accelerate faster than the field can mature. This paper discusses a set of four software degree programs that have evolved over 14 years at a small university with close ties to one software community. The context is computer engineering in a department of electrical and computer engineering, so the natural domain is software that is close to the hardware. This means an emphasis on real-time, embedded, and, to a lesser extent, safety-critical issues. The newest of the four programs is a Ph.D. program. It demonstrates that Ph.D. programs can be created with limited resources, given the right circumstances. If similar circumstances exist at other small universities, the rate of Ph.D. production in software engineering could be increased while maintaining quality. This paper describes the four degree programs, how they are related to each other, and how they have evolved. It makes limited comparisons to programs at other universities.
sei conference on software engineering education | 1992
Daniel Hoffman; Terry Shepard
In 1986, a rational design process for software was proposed [1]. This paper reports on experience teaching a course based on this process in an undergraduate computer engineering curriculum, with a companion course at the graduate level helping to feed the undergraduate course. Because the courses are for computer engineers, the emphasis is on real-time systems. The main issue is how to simplify the details of the steps so that all the steps can be learned and applied within the bounds of a one semester course. The culmination of the two courses is a project in which all the steps are used to create a working piece of software. In this paper, the steps as taught are described, with the aid of examples, and then some of the issues influencing the particular choice of material taught are discussed.
IEEE Software | 2004
Diane Kelly; Terry Shepard
Software engineering is still a young discipline. Software development group managers must keep their groups current with this dynamic body of knowledge as it evolves. There are two basic approaches: require staff to have both application expertise and software expertise, or create a software cell. The latter approach runs the risk of two communities not communicating well, although it might make staying abreast of changes in software engineering easier. The first approach should work better than it does today if some new educational patterns are put in place. For example, we could start treating software more like mathematics, introducing more software courses into undergraduate programs in other disciplines. Managers must also focus on the best way to develop software expertise for existing staff. Staff returning to school for a master's in software engineering can acquire a broad understanding of the field, but at a substantial cost in both time and effort. Short courses can help to fill this gap, but most short courses are skill-based, whereas a deeper kind of learning is needed. As the first step, however, managers must assess software's impact on their bottom-line deliverables. It might surprise them how much they depend on software expertise to deliver their products.
conference on object-oriented programming systems, languages, and applications | 2000
Ross McKegney; Terry Shepard
Real-time object-oriented modeling tools (e.g. Rational Rose-RT, i-Logix Rhapsody) allow developers to focus on software architecture by abstracting away low-level implementation details. We believe that design patterns can be very beneficial in this context, and present the rationale and concepts behind a proposal for the extension of such a toolset to support them explicitly.