Ronald J. Leach
Howard University
Publications
Featured research published by Ronald J. Leach.
Journal of Software: Evolution and Process | 1990
Ronald J. Leach
The term “software crisis” refers to the enormous resources required for the development and maintenance of software. A major problem with research in these areas is the lack of solid data. In particular, there is little data that can be used to predict the types of problems that are likely to occur during software maintenance. This paper describes the results of an analysis of a software system that underwent several revisions, with maintenance of the system performed by distinct programming teams. At each revision of the software system, module complexity was measured using three quantities: Halstead's Software Science Effort, McCabe's cyclomatic complexity, and coupling analysis. The relationships of Software Science Effort to the unchangeability of modules, of cyclomatic complexity to path coverage, and of coupling analysis to the measurement of modularity and coherence during maintenance are given.
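The paper's measurements are not reproduced here, but the two classical metrics it names have compact standard definitions. The Python sketch below is an illustration only; the example counts are hypothetical inputs, not data from the study. Note that E denotes edges in McCabe's formula but Effort in Halstead's.

```python
# Hedged sketch, not the paper's code: the two classical module
# complexity metrics named in the abstract, with hypothetical counts.
import math

def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
    """McCabe's metric M = E - N + 2P for a control-flow graph with
    E edges, N nodes, and P connected components (P = 1 for one routine)."""
    return edges - nodes + 2 * components

def halstead_effort(n1: int, n2: int, N1: int, N2: int) -> float:
    """Halstead's Software Science Effort E = D * V, where
    V = (N1 + N2) * log2(n1 + n2) is the Volume and
    D = (n1 / 2) * (N2 / n2) is the Difficulty; n1/n2 count distinct
    operators/operands and N1/N2 their total occurrences."""
    volume = (N1 + N2) * math.log2(n1 + n2)
    difficulty = (n1 / 2) * (N2 / n2)
    return difficulty * volume

# A single if/else gives 4 nodes and 4 edges, so M = 4 - 4 + 2 = 2.
print(cyclomatic_complexity(edges=4, nodes=4))                 # 2
print(round(halstead_effort(n1=10, n2=7, N1=28, N2=22), 1))    # ~3211.6
```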
Complex Variables and Elliptic Equations | 1984
Douglas M. Campbell; Ronald J. Leach
There are five sections in this survey paper. In Section 1, we indicate why multiplier problems are of some interest. In Section 2, we discuss the phenomenon that two distinct spaces of functions can have the same multipliers and relate this to a type of reflexivity. In Section 3, we give a concise summary of known results on multipliers of Hp and related spaces. In Section 4, we consider the problem of effectively describing multipliers instead of simply characterizing them abstractly. The fifth and final section closes with a list of open problems. The paper concentrates on complex analysis techniques for Hp multipliers of functions in the disc and does not attempt to survey the vast developments on Hp multipliers from the point of view of Fourier analysis, which are covered in the 1977 survey paper by R. Coifman and G. Weiss [CW]. For the reader's convenience, we collect in Table 1 the notations that will be used without further reminder throughout the rest of this ...
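As background for readers outside this area (this definition is not part of the abstract itself), the standard notion of a coefficient multiplier between Hardy spaces used throughout this literature can be sketched as follows:

```latex
% Standard definition of a coefficient multiplier between Hardy
% spaces on the unit disc, as used in this literature.
% A sequence \lambda = \{\lambda_n\} belongs to (H^p, H^q) when:
\[
  f(z) = \sum_{n=0}^{\infty} a_n z^n \in H^p
  \quad\Longrightarrow\quad
  (\lambda f)(z) = \sum_{n=0}^{\infty} \lambda_n a_n z^n \in H^q
  \qquad \text{for every } f \in H^p .
\]
```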
technical symposium on computer science education | 1988
Ronald J. Leach
ACM Sigada Ada Letters | 1987
Ronald J. Leach
Many students have great difficulty understanding concurrent programming at anything but the most superficial level. In this paper, we describe some experience teaching concurrent programming in Ada and give some suggestions for implementing the ideas discussed here.
Information & Software Technology | 1998
Ronald J. Leach; Claude A. Charles; Kenrick Fagan; Thembi Kimbrough; Kai R. Thomas
We describe our experience with the use of free and inexpensive software tools to reduce the costs of re-engineering a moderately sized software system. The original system was developed in Fortran and assembly language on a mainframe and was to be rewritten in C on a UNIX workstation. We concluded that a simple, inexpensive preliminary analysis can reduce re-engineering costs considerably; a saving of almost 20% was realized in this project. The process discussed can be generalized to other application domains.
parallel computing | 1990
Ronald J. Leach; O.Michael Atogi; Razeyah R Stephen
We consider several sequential and parallel algorithms for the evaluation of polynomials of low degree, with particular emphasis on those used frequently in computer graphics. A complete accounting of computation times, together with the speedup and efficiency of these algorithms, is reported. The results are compared to standard estimates of these quantities for single- and multiprocessor systems using classical complexity theory. A simulator configurable to several parallel architectures is used to validate the results obtained.
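The paper's specific algorithms and timings are not given here; as a hedged sketch, the Python below contrasts Horner's rule, whose loop-carried dependence makes it inherently sequential, with an even/odd coefficient split whose two half-degree chains are independent and could be assigned to separate processors. The classical measures referenced in the abstract are speedup S = T1/Tp and efficiency E = S/p for p processors.

```python
# Hedged sketch, not the paper's code: sequential vs. parallel-friendly
# polynomial evaluation.

def horner(coeffs, x):
    """Evaluate sum(coeffs[i] * x**i) by Horner's rule; each loop
    iteration depends on the previous one, so the chain is sequential."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

def even_odd(coeffs, x):
    """p(x) = p_even(x^2) + x * p_odd(x^2); the two Horner chains
    below are independent, so they could be evaluated in parallel."""
    even = horner(coeffs[0::2], x * x)
    odd = horner(coeffs[1::2], x * x)
    return even + x * odd

coeffs = [1.0, -2.0, 0.5, 3.0]   # 1 - 2x + 0.5x^2 + 3x^3
assert abs(horner(coeffs, 2.0) - even_odd(coeffs, 2.0)) < 1e-9
```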
technical symposium on computer science education | 1988
Ronald J. Leach; Jeffrey A. Brumfield; Michael B. Feldman; Charles M. Shub
Concurrency is a major trend in computer science; it can be taught from the point of view of operating systems, programming languages, algorithm design, database design, software engineering, systems engineering, and computer architecture. The panel will address the following questions among others: When should students be exposed to concurrency? In traditional or non-traditional courses? How many times? What must a programmer know about implementation? Must applications programmers now become experts in operating systems? What are the appropriate paradigms for development of concurrent programming in education?
technical symposium on computer science education | 2008
Ronald J. Leach
The ABET assessment process is highly complex and emphasizes the use of assessment to improve programs. Because the process is complex, it carries considerable overhead. This paper presents some models for estimating the added overhead of such assessment. The models can be used to help determine whether other activities must be curtailed because of the increased overhead of assessment, and they suggest which categories of colleges and universities will be affected most adversely by the overhead of the ABET accreditation process.
technical symposium on computer science education | 2008
Ronald J. Leach; Legand L. Burge; Harry Keeling
A recent paper by David Lechner stated that for many long-lived systems, it is more efficient to reengineer portions of systems than to continually repair them. That paper made an implicit assumption about the ability of software engineers to determine precisely which software should be reengineered. We report the results of a study that addresses the readiness of graduates, who will soon begin work as software engineers, to make such an assessment, based on their comprehension of reusable versus reengineered software. We address this comprehension in the context of software engineering education.
Journal of Systems and Software | 2008
Ronald J. Leach
In this paper we describe the results of a study of the insertion of checkpoints within a legacy software system in the aerospace domain. The purpose of the checkpoints was to improve fault-tolerance during program execution by rolling system control back to a saved state from which execution can continue. The study used novice programmers to determine where the checkpoints were to be added. The focus was on the programmers' understanding of the code, since this affected how the checkpoints were placed. The results should provide guidance to those interested in improving the fault-tolerance of legacy software systems, especially those written in older, nearly obsolete programming languages.
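The study's legacy system is of course not written in Python; purely as an illustrative sketch of the checkpoint-and-rollback pattern the abstract describes, the toy class below saves a known-good state and restores it when a fault is detected.

```python
# Hedged toy illustration of checkpoint/rollback, not the paper's system.
import copy

class Checkpointed:
    def __init__(self, state):
        self.state = state
        self._saved = copy.deepcopy(state)

    def checkpoint(self):
        """Save the current state; a later rollback returns here."""
        self._saved = copy.deepcopy(self.state)

    def rollback(self):
        """Restore the most recently saved state after a fault."""
        self.state = copy.deepcopy(self._saved)

sys_state = Checkpointed({"step": 0, "fuel": 100.0})
sys_state.state["step"] = 1
sys_state.checkpoint()               # known-good state saved
sys_state.state["fuel"] = -5.0       # simulated fault: invalid update
if sys_state.state["fuel"] < 0:
    sys_state.rollback()             # resume from the saved state
assert sys_state.state == {"step": 1, "fuel": 100.0}
```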