
Publication


Featured research published by Frederick N. Springsteel.


BIT Numerical Mathematics | 1987

Alternative methods for the reconstruction of trees from their traversals

H. A. Burgdorff; Sushil Jajodia; Frederick N. Springsteel; Yechezkel Zalcstein

It is well-known that given the inorder traversal of a binary tree's nodes, along with either its preorder or its postorder traversal, the original binary tree can be reconstructed using a recursive algorithm. In this short note we provide a short, elegant, iterative solution to this classical problem.
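
For orientation, here is the classic recursive reconstruction from the preorder and inorder traversals, assuming distinct node labels; this is the textbook approach the note improves on, not the authors' iterative algorithm.

```python
# Classic recursive reconstruction of a binary tree from its preorder and
# inorder traversals. This is the textbook approach, not the paper's
# iterative algorithm; node labels are assumed distinct.

class Node:
    def __init__(self, label, left=None, right=None):
        self.label, self.left, self.right = label, left, right

def rebuild(preorder, inorder):
    """Return the root of the unique binary tree with these traversals."""
    if not preorder:
        return None
    root_label = preorder[0]            # first preorder item is the root
    i = inorder.index(root_label)       # split inorder into left / right subtrees
    left = rebuild(preorder[1:i + 1], inorder[:i])
    right = rebuild(preorder[i + 1:], inorder[i + 1:])
    return Node(root_label, left, right)

# Example: the tree B(A, D(C, E)) has preorder BADCE and inorder ABCDE.
root = rebuild(list("BADCE"), list("ABCDE"))
assert root.label == "B" and root.left.label == "A" and root.right.label == "D"
```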


International Journal of Parallel Programming | 1990

Parallel general prefix computations with geometric, algebraic, and other applications

Frederick N. Springsteel; Ivan Stojmenovic

We introduce a generic problem component that captures the most common, difficult “kernel” of many problems. This kernel involves general prefix computations (GPC). The GPC's lower-bound complexity of Ω(n log n) time is established, and we give optimal solutions on the sequential model in O(n log n) time, on the CREW PRAM model in O(log n) time, on the BSR (broadcasting with selective reduction) model in constant time, and on mesh-connected computers in O(√n) time, all with n processors, plus an O(log² n) time solution on the hypercube model. We show that GPC techniques can be applied to a wide variety of geometric (point set and tree) problems, including triangulation of point sets, two-set dominance counting, ECDF searching, finding two- and three-dimensional maximal points, the reconstruction of trees from their traversals, counting inversions in a permutation, and matching parentheses.
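
As a small illustration of the prefix-computation flavour of these applications, the sketch below finds the maximal points of a planar point set with a sort followed by a single prefix-maximum scan; it is only a sequential toy, not the authors' GPC formulation or any of their parallel algorithms.

```python
# One listed application, sketched sequentially: maximal points of a planar
# point set via a sort plus a prefix-maximum scan. Illustrative only; the
# paper's contribution is the parallel GPC framework (CREW PRAM, BSR, mesh,
# hypercube), which this sketch does not model.

def maximal_points(points):
    """Return the points not dominated (in both coordinates) by another point."""
    # Sort by x descending; a point is then maximal iff its y exceeds the
    # running maximum of y over all points with larger x (a prefix maximum).
    pts = sorted(points, key=lambda p: (-p[0], -p[1]))
    best_y = float("-inf")
    result = []
    for x, y in pts:
        if y > best_y:
            result.append((x, y))
            best_y = y
    return result

print(maximal_points([(1, 4), (2, 2), (3, 3), (4, 1)]))
# -> [(4, 1), (3, 3), (1, 4)]  (the point (2, 2) is dominated by (3, 3))
```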


integrating technology into computer science education | 1996

Evaluation: turning technology from toy to tool: report of the working group on evaluation

Vicki L. Almstrum; Nell B. Dale; Anders Berglund; Mary J. Granger; Joyce Currie Little; Diane M. Miller; Marian Petre; Paul Schragger; Frederick N. Springsteel

Evaluation is an educational process, not an end in itself; we learn in order to help our students learn. This paper presents a pragmatic perspective on evaluation, viewing it as a matter of trade-offs. The space of possible evaluation approaches is analysed in terms of trade-offs among desired evidence, costs, and other constraints. This approach is illustrated with example scenarios, and a list of selected resources is provided.

Aim of the Working Group. This working group set out to consider how pragmatic, empirical evaluation can be used to harness technology for teaching Computer Science and Information Systems. Educators reject the tendency to adopt 'technology for technology's sake' and want to analyze technology in terms of its suitability for a teaching purpose and its impact, both costs and benefits, on teaching practice and outcomes. The question is not 'Can we use technology in teaching?', but 'Can we use technology to enhance teaching and improve learning?' Empirical evaluation and technology can form a powerful partnership to enhance teaching purposefully and usably. The working group explored the parameters of an effective partnership.

Introduction. Computer Science and Information Systems (CS/IS) are rife with examples of technology-driven projects that fail to address fundamental issues, with systems designed by introspection, with software evaluated by market share alone, with good ideas neglected after poor initial implementations. Evaluation is often seen as an expensive, time-consuming, esoteric process with little practical relevance. But principled, practical evaluation, that is, empirical study of actual practice, perhaps within a tightly focused question or a particular task, can identify crucial issues, debunk inappropriate folklore, give substance to intuition, disambiguate causes, and make the difference between failure and success. The introduction of new technologies increases the importance of evaluation in order to untangle the snarl of factors and influences that impinge on how technology is used in context. Unless educational technology can address educational objectives, the 'nifty' ideas it encompasses are no more than fashion. Evaluators need to base their analyses, and designers their designs, on real practice; not everything that is 'intuitive' or 'sexy' is appropriate within real teaching environments. Evaluation offers a means of putting technology into perspective, so that it is viewed as a tool for addressing real problems: a means, rather than an end in itself.

Technology as toy and tool. The current leading-edge technologies, such as videoconferencing, multimedia, software visualization, and Internet-enabled applications (World Wide Web, electronic mail, bulletin board systems, etc.), are perceived to have immediate potential for use as educational tools. However, it is all too easy to misapply these technologies, using them as flashy toys or interesting playthings. Technology-led adoption follows a 'we have it, let's use it' enthusiasm. But that can be a blind alley for evaluation: often the need for an answer expires before we have a chance to ask the question. We should pursue an education-led deliberation: 'We have it, but is it appropriate for this purpose?' Technology remains a toy when it is used merely because it is attractive and exciting, but its real potential is unexplored. Technology is often introduced into education to attract and excite, without any more than an assumption that it might be useful. But, if applied without deliberative study of its use in context and without the evaluation of the technology's impact on this use, 'educational' technology remains a toy.


Theoretical Computer Science | 1979

Complexity in mechanized hypothesis formation

Frederick N. Springsteel

Practical questions of computability are studied for a special mechanized method designed to suggest scientific hypotheses on the basis of sampled data: the so-called GUHA method. In simplified terms our GUHA system accepts particular data as a binary (or, binary plus “x”: unknown) input matrix, which relates objects in the sample to a common set of yes-or-no properties. It seeks to output factual (non-tautologous) formal sentences, which are true in the data and so yield general hypotheses for the universe of all such objects. This paper is the first detailed analysis of the algorithmic complexity of this type of system, by considering the time (number of steps) needed to solve its basic decision problem: whether some factual sentence, of various pre-specified forms, will be so output. The resulting time bounds are functions of changeable input size and give minima for overall system complexity. In fact, when judged by the norm for efficient computability of polynomial time, we present here both some positive and some closely related “negative” results: e.g. the distinction between P-time and NP-completeness (usually considered to require exponential time) often depends only on being given binary or ternary data, the basic question being existence of a true elementary disjunction. Quite similar results are true for sentences with either classical or non-classical (statistically motivated) quantifiers. Moreover, some closely related two-valued problems, involving input parameters to bound desired sentence length, resisted all our efforts to place them as P-time or NP-complete and have an apparently intermediate complexity. At least they are concrete candidates for the (theoretical) hierarchy of P-reducible degrees between P and NP (assuming P ≠ NP).
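
To make the setting concrete, the toy sketch below represents sampled data as a 0/1 object-by-property matrix and brute-forces the basic decision question mentioned in the abstract: does some elementary disjunction of a given size hold for every object? The data and the search are hypothetical illustrations, not the GUHA system; the exponential search also hints at why the general problem ties to NP-completeness.

```python
# Toy illustration (not the GUHA system): objects x properties as a 0/1 matrix,
# and the basic decision question from the abstract -- does some elementary
# disjunction over a chosen number of literals hold for every object?
from itertools import combinations

data = [  # rows = objects, columns = properties p0..p3 (hypothetical sample)
    [1, 0, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]

def disjunction_true(rows, literals):
    """literals: iterable of (column, wanted_value); true if every row satisfies one."""
    return all(any(row[c] == v for c, v in literals) for row in rows)

def some_true_disjunction(rows, size):
    """Brute-force search over all elementary disjunctions of `size` literals."""
    cols = range(len(rows[0]))
    candidates = [(c, v) for c in cols for v in (0, 1)]
    return any(disjunction_true(rows, lits)
               for lits in combinations(candidates, size))

print(some_true_disjunction(data, 1))  # True: property p2 holds for every object
```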


International Journal of Parallel Programming | 1984

Entity-relationship diagrams which are in BCNF

Sushil Jajodia; Peter A. Ng; Frederick N. Springsteel

In Ref. 8, we introduced a simplifying assumption about entity-relationship diagrams (ERDs), called regularity, and showed that regular ERDs have several desirable properties. One such property is that every relation schema in the ERD's canonical relational scheme can be put into Third Normal Form. We left open there the more basic question: under what conditions would the original relation schemas actually be in Boyce-Codd Normal Form (BCNF)? Since the visible semantics of ERDs naturally determine their associated functional dependencies (fds), it is important to know when an ERD, as designed, already has this strongest normal form given purely in terms of fds. We show here a sufficient diagrammatic condition (loop-freeness) under which a regular ERD will have databases enjoying the benefits of BCNF.
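
For readers who want the normal-form claim made concrete, the sketch below is a generic BCNF check from a given set of functional dependencies using attribute closure; it is a standard textbook test on a hypothetical relation, not the paper's diagrammatic loop-freeness condition.

```python
# Generic BCNF check from functional dependencies via attribute closure. This
# only makes the normal-form claim concrete; the paper's contribution is a
# purely diagrammatic (loop-free) condition on the ERD, not modeled here.

def closure(attrs, fds):
    """Closure of a set of attributes under fds = [(lhs, rhs), ...] of sets."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def is_bcnf(schema, fds):
    """Every nontrivial fd lhs -> rhs must have lhs as a superkey of schema."""
    return all(rhs <= lhs or closure(lhs, fds) >= schema for lhs, rhs in fds)

# Hypothetical relation derived from an ERD: Emp(eno, dno, dname)
# with eno -> dno and dno -> dname.
schema = {"eno", "dno", "dname"}
fds = [({"eno"}, {"dno"}), ({"dno"}, {"dname"})]
print(is_bcnf(schema, fds))  # False: dno -> dname violates BCNF (dno is not a key)
```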


technical symposium on computer science education | 1999

The peer review process of teaching materials: report of the ITiCSE'99 working group on validation of the quality of teaching materials

Deborah Knox; Don Goelman; Sally Fincher; James Hightower; Nell B. Dale; Ken Loose; Elizabeth S. Adams; Frederick N. Springsteel

When an instructor adopts teaching materials, he/she wants some measure of confidence that the resource is effective, correct, and robust. The measurement of the quality of a resource is an open problem. It is our thesis that the traditional evaluative approach to peer review is not appropriate to ensure the quality of teaching materials, which are created with different contextual constraints. This Working Group report focuses on the evaluation process by detailing a variety of review models. The evolution of the development and review of teaching materials is outlined and the contexts for creation, assessment, and transfer are discussed. We present an empirical study of evaluation forms conducted at the ITiCSE'99 conference, and recommend at least one new review model for the validation of the quality of teaching resources.


International Journal of Human-Computer Studies / International Journal of Man-Machine Studies | 1981

Complexity of hypothesis formation problems

Frederick N. Springsteel

This paper reviews some results concerning the computational complexity of the processes of mechanized hypothesis formation in the framework of GUHA methods.


database and expert systems applications | 1993

Object-based Schema Integration for Heterogeneous Databases: A Logical Approach

Frederick N. Springsteel

Given the large number of conceptual schema design tools and methodologies in use today, conversion efforts need strategies by which any source schemas can be easily integrated in heterogeneous database systems. We discuss the design and analysis of strategies for multiple schema/view integration with an extensible redesign system that is object-based, as defined herein. Our focus is on well-defined, clearly identifiable objects: E-R schemas describable in linear form, or in graphical (E-R Diagram) form with annotations. E-R schemas are a composite class of basic objects: E-sets (of entities) and R-sets (of relationships), each of which is extended. E-sets are described by structured sets of keys, lists of attributes (with domains), indexes, and textual constraints. R-sets are structured lists of E-set names with a description of their participation in the R-set [“one” or “many”], and textual constraints, e.g. foreign keys. We propose a design for the missing step: a Graphic Schema Algebra to merge various conceptual schemas, to provide a leverage point for interoperation.
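
Below is a minimal sketch of the object-based representation the abstract describes, with E-sets carrying keys, attributes, and constraints and R-sets listing the participating E-sets with their "one"/"many" participation; class and field names here are hypothetical, not taken from the paper.

```python
# Minimal sketch of the object-based schema representation described in the
# abstract: E-sets with keys, attributes (with domains), and constraints;
# R-sets listing participating E-sets with "one"/"many" participation.
# Class and field names are hypothetical, not taken from the paper.
from dataclasses import dataclass, field

@dataclass
class ESet:
    name: str
    keys: list            # structured set of key attributes
    attributes: dict      # attribute name -> domain
    constraints: list = field(default_factory=list)   # textual constraints

@dataclass
class RSet:
    name: str
    participants: list    # [(E-set name, "one" | "many"), ...]
    constraints: list = field(default_factory=list)   # e.g. foreign keys

dept = ESet("Department", keys=["dno"], attributes={"dno": "int", "dname": "string"})
emp = ESet("Employee", keys=["eno"], attributes={"eno": "int", "ename": "string"})
works_in = RSet("WorksIn", participants=[("Employee", "many"), ("Department", "one")])
```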


Information & Software Technology | 1992

Reverse data engineering technology for visual database design

Frederick N. Springsteel; C. Kou

The data engineer's inverse mapping problem of constructing, from a relational database schema (RDBS), a corresponding entity-relationship diagram (ERD) is studied. For going from extended ERDs to an RDBS there is a well-known translation algorithm. The inverse direction is more difficult because many ERDs may correspond to one RDBS, and so it is not clear how to choose a most representative ERD. Nonetheless, it would be desirable to be able to do so whenever possible, so that the benefits of entity-relationship (E-R) based visual design can be applied to any RDBS. The paper presents a first demonstration prototype for the inverse mapping problem, for the case of an RDBS that was originally designed by an E-R-based tool and was then altered. A reverse translator (ReTro-ERDDS) tracks each atomic change in the RDBS and determines, via its Prolog E-R knowledge base, the corresponding changes in the original ERD, while preserving well-formedness, a guarantor of normalization and robustness.
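
To make the inverse mapping problem concrete, the sketch below applies a common reverse-engineering heuristic: a relation whose primary key consists entirely of foreign keys is read back as a relationship set, otherwise as an entity set. This is only an illustrative rule of thumb, not ReTro-ERDDS or its Prolog knowledge base.

```python
# A common reverse-engineering heuristic, sketched only to make the inverse
# mapping problem concrete (this is not ReTro-ERDDS): a relation whose primary
# key consists entirely of foreign keys is read as a relationship set;
# otherwise it is read as an entity set.

def classify(relation):
    """relation: dict with 'name', 'key' (set), and 'foreign_keys' (set)."""
    fk = relation["foreign_keys"]
    kind = "relationship" if relation["key"] and relation["key"] <= fk else "entity"
    return relation["name"], kind

schema = [
    {"name": "Employee", "key": {"eno"}, "foreign_keys": {"dno"}},
    {"name": "Department", "key": {"dno"}, "foreign_keys": set()},
    {"name": "WorksOn", "key": {"eno", "pno"}, "foreign_keys": {"eno", "pno"}},
]
print([classify(r) for r in schema])
# -> [('Employee', 'entity'), ('Department', 'entity'), ('WorksOn', 'relationship')]
```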


hawaii international conference on system sciences | 1988

Entity-relationship logical design of database systems: relational normal forms and extended regular ERDs

Frederick N. Springsteel


Collaboration


Dive into Frederick N. Springsteel's collaborations.

Top Co-Authors

Nell B. Dale, University of Texas at Austin
Peter A. Ng, New Jersey Institute of Technology
C. Kou, University of Missouri