Publication


Featured research published by Henry D. Shapiro.


SIAM Journal on Scientific and Statistical Computing | 1985

On Minimizing a Set of Tests

Bernard M. E. Moret; Henry D. Shapiro

Minimizing the size or cost of a set of tests without losing any discrimination power is a common problem in fault testing and diagnosis, pattern recognition, and biological identification. This problem, referred to as the minimum test set problem, is known to be NP-hard, so determining an optimal solution is not always computationally feasible. Accordingly, researchers have proposed a number of heuristics for building approximate solutions, without, however, providing an analysis of their performance. In this paper, we take an in-depth look at the main heuristics and at the optimal solution methods, from both a theoretical and an experimental standpoint. We characterize the worst-case behavior of the heuristics and discuss their use in bounding. We then present the results of extensive experimentation with randomly generated problems. While the exponential explosion suggested by the problem's NP-hardness is apparent, our results suggest that real-world testing problems of large sizes can be solved quickly at the expense of large storage requirements.
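The paper's specific heuristics are not reproduced in this abstract, but the flavor of a greedy approach to the minimum test set problem can be sketched as follows. This is a hypothetical illustration: the function name, input format, and the set-cover-style greedy rule are ours, not the paper's.

```python
from itertools import combinations

def greedy_test_set(n_items, tests):
    """Greedily choose tests until every pair of items is distinguished.

    tests maps a test name to the set of items that pass it; a test
    distinguishes the pair (a, b) iff exactly one of a, b passes it.
    This is the classic set-cover-style greedy, not any specific
    heuristic analyzed in the paper.
    """
    remaining = set(combinations(range(n_items), 2))
    chosen = []
    while remaining:
        # For each test, count how many still-confused pairs it separates.
        gains = {t: sum((a in s) != (b in s) for a, b in remaining)
                 for t, s in tests.items()}
        best = max(gains, key=gains.get)
        if gains[best] == 0:
            raise ValueError("the tests cannot distinguish all items")
        chosen.append(best)
        remaining = {(a, b) for a, b in remaining
                     if (a in tests[best]) == (b in tests[best])}
    return chosen
```

For three items where test "t1" is passed only by item 0 and "t2" by items 0 and 1, both tests are needed to separate all three pairs.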


SIAM Journal on Discrete Mathematics | 1993

An exact characterization of greedy structures

Paul Helman; Bernard M. E. Moret; Henry D. Shapiro

The authors present exact characterizations of structures on which the greedy algorithm produces optimal solutions. These characterizations, called matroid embeddings, complete the partial characterizations of Rado [A note on independent functions, Proc. London Math. Soc., 7 (1957), pp. 300–320], Gale [Optimal assignments in an ordered set, J. Combin. Theory, 4 (1968), pp. 176–180], and Edmonds [Matroids and the greedy algorithm, Math. Programming, 1 (1971), pp. 127–136] (matroids), and of Korte and Lovász [Greedoids and linear objective functions, SIAM J. Alg. Discrete Meth., 5 (1984), pp. 239–248] and [Mathematical structures underlying greedy algorithms, in Fundamentals of Computation Theory, LNCS 177, Springer-Verlag, 1981, pp. 205–209] (greedoids). It is shown that the greedy algorithm optimizes all linear objective functions if and only if the problem structure (phrased in terms of either accessible set systems or hereditary languages) is a matroid embedding. An exact characterization of the ...
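The greedy algorithm in question can be stated abstractly over any weighted set system equipped with an independence oracle. Below is a minimal Python sketch of that generic greedy; the function names and the rank-2 example are ours, for illustration only, and the sketch does not encode the matroid-embedding condition itself.

```python
def greedy(elements, weight, independent):
    """Generic greedy: scan elements by decreasing weight and keep each
    element whose addition preserves independence.

    The paper's theorem says this procedure is optimal for every linear
    objective exactly when the underlying structure is a matroid
    embedding; for a plain matroid (e.g. a uniform matroid) it is the
    familiar optimal greedy.
    """
    chosen = set()
    for e in sorted(elements, key=weight, reverse=True):
        if independent(chosen | {e}):
            chosen.add(e)
    return chosen
```

On the uniform matroid of rank 2 (any set of at most two elements is independent), the greedy simply keeps the two heaviest elements.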


Workshop on Algorithms and Data Structures | 1991

An empirical analysis of algorithms for constructing a minimum spanning tree

Bernard M. E. Moret; Henry D. Shapiro

We compare algorithms for the construction of a minimum spanning tree through large-scale experimentation on randomly generated graphs of different structures and different densities. In order to extrapolate with confidence, we use graphs with up to 130,000 nodes (sparse) or 750,000 edges (dense). Algorithms included in our experiments are Prim's algorithm (implemented with a variety of priority queues), Kruskal's algorithm (using presorting or demand sorting), Cheriton and Tarjan's algorithm, and Fredman and Tarjan's algorithm. We also ran a large variety of tests to investigate low-level implementation decisions for the data structures, as well as to eliminate the effect of compilers and architectures.
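As an illustration of one of the variants compared, here is a binary-heap implementation of Prim's algorithm. This is our own sketch, not the paper's code; the experimental versions varied the priority queue and many low-level details.

```python
import heapq

def prim_mst_weight(n, adj):
    """Total weight of a minimum spanning tree of a connected graph.

    adj[u] is a list of (weight, v) pairs.  The binary heap used here
    is only one of several priority-queue choices such an experiment
    would compare (d-heaps, Fibonacci heaps, ...).
    """
    visited = [False] * n
    heap = [(0, 0)]          # (weight of edge into the tree, vertex)
    total = 0
    while heap:
        w, u = heapq.heappop(heap)
        if visited[u]:
            continue                     # stale entry; vertex already in tree
        visited[u] = True
        total += w
        for wt, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (wt, v))
    return total
```

On a triangle with edge weights 1, 2, and 3, the MST keeps the two cheapest edges, for a total weight of 3.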


Journal of Universal Computer Science | 2001

Algorithms and Experiments: The New (and Old) Methodology

Bernard M. E. Moret; Henry D. Shapiro

The last twenty years have seen enormous progress in the design of algorithms, but little of it has been put into practice. Because many recently developed algorithms are hard to characterize theoretically and have large running-time coefficients, the gap between theory and practice has widened over these years. Experimentation is indispensable in the assessment of heuristics for hard problems, in the characterization of the asymptotic behavior of complex algorithms, and in the comparison of competing designs for tractable problems. Implementation, although perhaps not rigorous experimentation, was characteristic of early work in algorithms and data structures. Donald Knuth has throughout insisted on testing every algorithm and conducting analyses that can predict behavior on actual data; more recently, Jon Bentley has vividly illustrated the difficulty of implementation and the value of testing. Numerical analysts have long understood the need for standardized test suites to ensure the robustness, precision, and efficiency of numerical libraries. It is only recently, however, that the algorithms community has shown signs of returning to implementation and testing as an integral part of algorithm development. The emerging disciplines of experimental algorithmics and algorithm engineering have revived and are extending many of the approaches used by computing pioneers such as Floyd and Knuth, and are placing on a formal basis many of Bentley's observations. We reflect on these issues, looking back at the last thirty years of algorithm development and forward to new challenges: designing cache-aware algorithms, algorithms for mixed models of computation, algorithms for external memory, and algorithms for scientific research.


Journal of Scientific Computing | 1999

Locating Discontinuities of a Bounded Function by the Partial Sums of Its Fourier Series

George Kvernadze; Thomas Hagstrom; Henry D. Shapiro

A key step for some methods dealing with the reconstruction of a function with jump discontinuities is the accurate approximation of the jumps and their locations. Various methods have been suggested in the literature to obtain this valuable information. In the present paper, we develop an algorithm based on asymptotic expansion formulae obtained in our earlier work. The algorithm enables one to approximate the locations of discontinuities and the magnitudes of jumps of a bounded function given its truncated Fourier series. We investigate the stability of the method and study its complexity. Finally, we consider several numerical examples in order to emphasize strong and weak points of the algorithm.
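The paper's algorithm rests on asymptotic expansion formulae from the authors' earlier work and is not reproduced in this abstract. The underlying phenomenon, that the derivative of a truncated Fourier series peaks at a jump, can be seen in a toy example. The construction below is entirely ours: it uses the square wave f(x) = sign(x) on (-π, π), whose Fourier series is known in closed form, and simply maximizes the partial sum's derivative over a grid.

```python
import math

def locate_jump(n_terms, grid):
    """Locate the jump of the square wave f(x) = sign(x) on (-pi, pi).

    Its Fourier series is (4/pi) * sum over odd k of sin(k*x)/k, so the
    derivative of the truncated series, (4/pi) * sum of cos(k*x), is
    largest at the jump; we maximize it over the supplied grid.
    """
    def deriv(x):
        return (4 / math.pi) * sum(math.cos(k * x)
                                   for k in range(1, n_terms + 1, 2))
    return max(grid, key=deriv)
```

With 41 terms and a 400-point grid on [-π, π], the maximizer lands on the grid point nearest the true jump at x = 0. The paper's asymptotic-expansion approach refines exactly this kind of crude estimate.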


Algorithmica | 1991

Heuristics for rapidly four-coloring large planar graphs

Craig A. Morgenstern; Henry D. Shapiro

We present several algorithms for rapidly four-coloring large planar graphs and discuss the results of extensive experimentation with over 140 graphs from two distinct classes of randomly generated instances having up to 128,000 vertices. Although the algorithms can potentially require exponential time, the observed running times of our more sophisticated algorithms are linear in the number of vertices over the range of sizes tested. The use of Kempe chaining and backtracking, together with a fast heuristic which usually, but not always, resolves impasses, gives us hybrid algorithms that: (1) successfully four-color all our test graphs, and (2) in practice run, on average, only a factor of two slower than the well-known, nonexact, simple-to-code, Θ(n) saturation algorithm of Brélaz.
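Brélaz's saturation (DSATUR) algorithm, the baseline mentioned above, can be sketched in a few lines. This is a straightforward quadratic-time sketch of our own; achieving the Θ(n) behavior cited in the paper requires more careful bucket data structures.

```python
def dsatur(adj):
    """Greedy saturation coloring.  adj maps vertex -> set of neighbors.

    Repeatedly color the uncolored vertex whose neighborhood already
    uses the most distinct colors (ties broken by degree), giving it
    the smallest color absent from that neighborhood.
    """
    color = {}
    while len(color) < len(adj):
        def saturation(v):
            return (len({color[u] for u in adj[v] if u in color}),
                    len(adj[v]))
        v = max((u for u in adj if u not in color), key=saturation)
        used = {color[u] for u in adj[v] if u in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color
```

On a triangle, DSATUR correctly uses three distinct colors, one per vertex.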


Computers & Mathematics With Applications | 2000

Detecting the singularities of a function of Vp class by its integrated Fourier series

George Kvernadze; Thomas Hagstrom; Henry D. Shapiro

In the present paper, we pursue the general idea suggested in our previous work. Namely, we utilize the truncated Fourier series as a tool for the approximation of the points of discontinuities and the magnitudes of jumps of a 2π-periodic bounded function. Earlier, we used the derivative of the partial sums, while in this work we use integrals. First, we obtain new identities which determine the jumps of a 2π-periodic function of Vp, 1 ≤ p < 2, class, with a finite number of discontinuities, by means of the tails of its integrated Fourier series. Next, based on these identities we establish asymptotic expansions for the approximations of the location of the discontinuity and the magnitude of the jump of a 2π-periodic piecewise smooth function with one singularity. By an appropriate linear combination, obtained via integrals of different order, we significantly improve the accuracy of the initial approximations. Then, we apply Richardson's extrapolation method to enhance the approximation results. For a function with multiple discontinuities we use simple formulae which "eliminate" all discontinuities of the function but one. Then we treat the function as if it had one singularity. Finally, we give the description of a programmable algorithm for the approximation of the discontinuities, investigate the stability of the method, study its complexity, and present some numerical results.


Ecological Modelling | 1997

An artificial life approach to host-parasite interactions

Patricia G. Wilber; Henry D. Shapiro

A spatially explicit, individual-based model simulating host-parasite interactions between Townsend's ground squirrels (Spermophilus townsendii) and two of its parasites (an eimerian (Protozoa: Apicomplexa) and a helminth) was developed to assess the interactions of multiple factors (host immunity, parasite life history, weather, and chance) on this system. Most functions in the model are driven by probabilities rather than deterministic equations. The model results corroborate observations from real squirrel populations and suggest that such systems may be quite stable at high host densities, but that equilibrium may be unattainable at the host densities often found in real systems because the equilibrium values change as conditions change. Altering the start-up parameters affected which mechanisms most strongly influenced the system; chance events were very important, especially at low host densities, suggesting that long-term studies are needed to fully understand year-to-year variation. Parasite life-history strategies had more influence on model outcome than the other parameters.


New Results and New Trends in Computer Science | 1991

How to Find a Minimum Spanning Tree in Practice

Bernard M. E. Moret; Henry D. Shapiro

We address the question of theoretical vs. practical behavior of algorithms for the minimum spanning tree problem. We review the factors that influence the actual running time of an algorithm, from choice of language, machine, and compiler, through low-level implementation choices, to purely algorithmic issues. We discuss how to design a careful experimental comparison between various alternatives. Finally, we present some results from an ongoing study in which we use multiple languages, compilers, and machines; all the major variants of the comparison-based algorithms; and eight varieties of graphs with sizes of up to 130,000 vertices (in sparse graphs) or 750,000 edges (in dense graphs).


Technical Symposium on Computer Science Education | 1980

The results of an informal study to evaluate the effectiveness of teaching structured programming

Henry D. Shapiro

An informal experiment to evaluate the effectiveness of teaching structured programming from the beginning of a student's career as a computer science student was conducted during the summer session at The University of New Mexico. Performed at the last minute, and clearly unscientific in both design and implementation, the experiment nonetheless suggests that too great an emphasis on programming without GOTOs leads to poor programming style and incorrect implementation of straightforward algorithms.

Collaboration


Dive into Henry D. Shapiro's collaborations.

Top Co-Authors

Bernard M. E. Moret

École Polytechnique Fédérale de Lausanne

Paul Helman

University of New Mexico
