Network


Latest external collaborations at the country level. Dive into the details by clicking on the dots.

Hotspot


Dive into the research topics where Peter J. Denning is active.

Publication


Featured research published by Peter J. Denning.


Communications of the ACM | 1983

A nation at risk: the imperative for educational reform

Peter J. Denning

The National Commission on Excellence in Education released a remarkable report, A Nation at Risk. This Report has stimulated in the media considerable discussion about the problems in our schools, speculation about the causes, and assignment of blame. Astonishingly, few of the media reports have focused on the specific findings and recommendations of the Commission. Almost none of the media reports tells that the Commission itself refrained from speculation on causes and from assignment of blame. Because of the extraordinary clarity and importance of the Commission's Report, the editors of the Communications decided to reprint the Report's main section in its entirety. We are pleased to present it to you here.


Communications of the ACM | 1977

Certification of programs for secure information flow

Dorothy E. Denning; Peter J. Denning

This paper presents a certification mechanism for verifying the secure flow of information through a program. Because it exploits the properties of a lattice structure among security classes, the procedure is sufficiently simple that it can easily be included in the analysis phase of most existing compilers. Appropriate semantics are presented and proved correct. An important application is the confinement problem: The mechanism can prove that a program cannot cause supposedly nonconfidential results to depend on confidential input data.
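As a rough illustration of the lattice idea (not the paper's actual procedure), information may flow from class src to class dst only when class(src) is at or below class(dst); the sketch below assumes a two-level Low/High lattice, and the function names are invented.

```python
# Minimal sketch of a lattice-based flow check, assuming a two-level
# security lattice Low <= High (class names here are illustrative only).
LEVELS = {"Low": 0, "High": 1}

def flows_to(src: str, dst: str) -> bool:
    """Information may flow from src to dst only if class(src) <= class(dst)."""
    return LEVELS[src] <= LEVELS[dst]

def certify_assignment(target_class: str, operand_classes: list[str]) -> bool:
    """An assignment x := f(a, b, ...) is secure if every operand's class
    flows to the class of x (the join of operand classes <= class(x))."""
    return all(flows_to(c, target_class) for c in operand_classes)

# Example: copying confidential data into a public variable is rejected.
print(certify_assignment("Low", ["High"]))          # False -> insecure flow
print(certify_assignment("High", ["Low", "High"]))  # True  -> permitted
```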


ACM Computing Surveys | 1970

Virtual memory

Peter J. Denning

Virtual memory is the simulation of a storage space so large that programmers do not need to reprogram or recompile their works when the capacity of a local memory or the configuration of a network changes. The name, borrowed from optics, recalls the virtual images formed by mirrors and lenses--images that are not there but behave as if they are. The designers of the Atlas Computer at the University of Manchester invented virtual memory in the 1950s to eliminate a looming programming problem: planning and scheduling data transfers between main and secondary memory and recompiling programs for each change of size of main memory. Virtual memory is even more useful in the computers of the 1990s, which have more things to hide--on-chip caches, separate RAM chips, local disk storage, network file servers (q.v.), large numbers of separately compiled program modules, other computers on the local bus or local network, or the Internet. The story of virtual memory from then to now is a story about machines helping programmers solve problems in storage allocation, protection of information, sharing and reuse of objects, and linking of program components. Virtual memory, common in all computers and operating systems from the smallest microprocessor to the largest supercomputer, is now invading the Internet.
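A toy sketch of the mechanism virtual memory automates follows; the page size, page table, and frame pool are invented for illustration. It translates virtual addresses and brings pages in on demand.

```python
# Toy virtual-to-physical address translation with demand paging.
# Page size and data structures are invented for illustration.
PAGE_SIZE = 4096

page_table = {}          # virtual page number -> physical frame number
free_frames = [0, 1, 2]  # frames available in "main memory"

def translate(virtual_address: int) -> int:
    vpn, offset = divmod(virtual_address, PAGE_SIZE)
    if vpn not in page_table:          # page fault: bring the page in
        if not free_frames:
            raise MemoryError("no free frame; a replacement policy would evict one")
        page_table[vpn] = free_frames.pop()
    return page_table[vpn] * PAGE_SIZE + offset

print(hex(translate(0x2345)))  # faults on first touch, then reuses the mapping
print(hex(translate(0x2FFF)))  # same page, no further fault
```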


ACM Computing Surveys | 1978

The Operational Analysis of Queueing Network Models

Peter J. Denning; Jeffrey P. Buzen

Queueing network models have proved to be cost effective tools for analyzing modern computer systems. This tutorial paper presents the basic results using the operational approach, a framework which allows the analyst to test whether each assumption is met in a given system. The early sections describe the nature of queueing network models and their applications for calculating and predicting performance quantities. The basic performance quantities--such as utilizations, mean queue lengths, and mean response times--are defined, and operational relationships among them are derived. Following this, the concept of job flow balance is introduced and used to study asymptotic throughputs and response times. The concepts of state transition balance, one-step behavior, and homogeneity are then used to relate the proportions of time that each system state is occupied to the parameters of job demand and to device characteristics. Efficient methods for computing basic performance quantities are also described. Finally, the concept of decomposition is used to simplify analyses by replacing subsystems with equivalent devices. All concepts are illustrated liberally with examples.
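For illustration, here is a brief sketch of two operational laws of the kind the paper derives (the Utilization Law U = XS and Little's Law N = XR), applied to invented measurements.

```python
# Sketch of two basic operational laws from invented measurements.
# Utilization Law: U = X * S   (throughput times mean service time)
# Little's Law:    N = X * R   (mean queue length = throughput * response time)

completions = 500          # jobs completed during the observation period
observed_seconds = 100.0   # length of the observation period
busy_seconds = 60.0        # time the device was busy

X = completions / observed_seconds      # throughput (jobs/second)
S = busy_seconds / completions          # mean service time per job
U = X * S                               # utilization, equals busy/observed

R = 0.4                                 # assumed mean response time (seconds)
N = X * R                               # mean number of jobs at the device

print(f"X = {X:.2f} jobs/s, U = {U:.2f}, N = {N:.1f}")
```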


IEEE Transactions on Software Engineering | 1980

Working Sets Past and Present

Peter J. Denning

A program's working set is the collection of segments (or pages) recently referenced. This concept has led to efficient methods for measuring a program's intrinsic memory demand; it has assisted in understanding and in modeling program behavior; and it has been used as the basis of optimal multiprogrammed memory management. The total cost of a working set dispatcher is no larger than the total cost of other common dispatchers. This paper outlines the argument why it is unlikely that anyone will find a cheaper nonlookahead memory policy that delivers significantly better performance.
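A minimal sketch of the working-set definition W(t, tau), the set of distinct pages referenced in the window (t - tau, t], using an invented reference string:

```python
# Sketch: the working set W(t, tau) is the set of distinct pages referenced
# in the window (t - tau, t]. Reference string and window size are invented.
def working_set(references: list[int], t: int, tau: int) -> set[int]:
    start = max(0, t - tau)
    return set(references[start:t])

refs = [1, 2, 1, 3, 2, 2, 4, 1, 4, 4]
print(working_set(refs, t=10, tau=4))   # {1, 4} -- recent demand, not total footprint
print(working_set(refs, t=5, tau=4))    # {1, 2, 3}
```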


ACM Computing Surveys | 1979

Data Security

Dorothy E. Denning; Peter J. Denning

The rising abuse of computers and increasing threat to personal privacy through data banks have stimulated much interest in the technical safeguards for data. There are four kinds of safeguards, each related to but distinct from the others. Access controls regulate which users may enter the system and subsequently which data sets an active user may read or write. Flow controls regulate the dissemination of values among the data sets accessible to a user. Inference controls protect statistical databases by preventing questioners from deducing confidential information by posing carefully designed sequences of statistical queries and correlating the responses. Statistical data banks are much less secure than most people believe. Data encryption attempts to prevent unauthorized disclosure of confidential information in transit or in storage. This paper describes the general nature of controls of each type, the kinds of problems they can and cannot solve, and their inherent limitations and weaknesses. The paper is intended for a general audience with little background in the area.


Communications of the ACM | 2005

The locality principle

Peter J. Denning

Locality of reference is a fundamental principle of computing with many applications. Here is its story. Locality is a universal behavior of all computational processes: They tend to refer repeatedly to a subset of their resources over extended time intervals. System designers have exploited this behavior to optimize performance in numerous ways, which include caching, clustering of related objects, search engines, organizations of databases, spam filters, and forensics.
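Caching is one of the exploits mentioned; a minimal LRU cache sketch (capacity, keys, and loader function all invented) shows how repeated references to a small subset of keys are served without touching slower storage.

```python
# Minimal LRU cache sketch: repeated references to a small subset of keys
# (temporal locality) are served from the cache instead of "slow" storage.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key, load_fn):
        if key in self.data:
            self.data.move_to_end(key)      # hit: mark as most recently used
            return self.data[key]
        value = load_fn(key)                # miss: fetch from slower storage
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the least recently used key
        return value

cache = LRUCache(capacity=2)
for k in ["a", "b", "a", "a", "c", "a"]:    # locality: "a" dominates
    cache.get(k, load_fn=lambda key: key.upper())
```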


ACM Computing Surveys | 1976

Fault Tolerant Operating Systems

Peter J. Denning

This paper develops four related architectural principles which can guide the construction of error-tolerant operating systems. The fundamental principle, system closure, specifies that no action is permissible unless explicitly authorized. The capability-based machine is the most efficient known embodiment of this principle: it allows efficient small access domains, multiple domain processes without a privileged mode of operation, and user and system descriptor information protected by the same mechanism. System closure implies a second principle, resource control, that prevents processes from exchanging information via residual values left in physical resource units. These two principles enable a third, decision verification by failure-independent processes. These principles enable prompt error detection and cost-effective recovery. Implementations of these principles are given for process management, interrupts and traps, store access through capabilities, protected procedure entry, and tagged architecture.
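A small sketch of the system-closure idea using capabilities (object names and rights are invented, not taken from the paper): an operation proceeds only if a held capability explicitly grants it, so the default is denial.

```python
# Sketch of the system-closure principle with capabilities: an operation is
# permitted only if the process holds a capability that explicitly grants it.
from dataclasses import dataclass

@dataclass(frozen=True)
class Capability:
    object_id: str
    rights: frozenset  # e.g. frozenset({"read", "write"})

def authorized(caps: list[Capability], object_id: str, right: str) -> bool:
    return any(c.object_id == object_id and right in c.rights for c in caps)

process_caps = [Capability("file42", frozenset({"read"}))]
print(authorized(process_caps, "file42", "read"))   # True: explicitly granted
print(authorized(process_caps, "file42", "write"))  # False: default is denial
```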


ACM Transactions on Database Systems | 1979

The tracker: a threat to statistical database security

Dorothy E. Denning; Peter J. Denning; Mayer D. Schwartz

The query programs of certain databases report raw statistics for query sets, which are groups of records specified implicitly by a characteristic formula. The raw statistics include query set size and sums of powers of values in the query set. Many users and designers believe that the individual records will remain confidential as long as query programs refuse to report the statistics of query sets which are too small. It is shown that the compromise of small query sets can in fact almost always be accomplished with the help of characteristic formulas called trackers. Schlörer's individual tracker is reviewed; it is derived from known characteristics of a given individual and permits deducing additional characteristics he may have. The general tracker is introduced: It permits calculating statistics for arbitrary query sets, without requiring preknowledge of anything in the database. General trackers always exist if there are enough distinguishable classes of individuals in the database, in which case the trackers have a simple form. Almost all databases have a general tracker, and general trackers are almost always easy to find. Security is not guaranteed by the lack of a general tracker.
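A toy example in the spirit of the individual tracker described here (the records, threshold, and predicates are invented): a direct query about one person is refused because its query set is too small, yet two allowed queries reveal the confidential value anyway.

```python
# Sketch of an individual tracker attack (data and predicates invented).
# The database refuses queries whose query set has fewer than K records,
# yet two allowed queries still reveal one person's confidential salary.
K = 2
people = [
    {"name": "A", "dept": "CS", "rank": "prof", "salary": 120},
    {"name": "B", "dept": "CS", "rank": "grad", "salary": 30},
    {"name": "C", "dept": "CS", "rank": "grad", "salary": 32},
    {"name": "D", "dept": "EE", "rank": "prof", "salary": 110},
]

def sum_query(predicate):
    matches = [p for p in people if predicate(p)]
    if len(matches) < K:
        raise ValueError("query set too small -- refused")
    return sum(p["salary"] for p in matches)

# Target: the unique CS professor. The direct query is refused (size 1),
# but the tracker "CS and not prof" pads both queries above the threshold.
total_cs = sum_query(lambda p: p["dept"] == "CS")
cs_not_prof = sum_query(lambda p: p["dept"] == "CS" and p["rank"] != "prof")
print(total_cs - cs_not_prof)   # 120: the target's salary, deduced exactly
```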


Communications of the ACM | 2005

Wikipedia risks

Peter J. Denning; Jim Horning; David Lorge Parnas; Lauren Weinstein

The Wikipedia (WP; en.wikipedia.org/wiki/) applies the wiki technology (from a Hawaiian word for “quick”) to the encyclopedia, a venerable form of knowledge organization and dissemination. Wikipedia provides a fast and flexible way for anyone to create and edit encyclopedia articles without the delay and intervention of a formal editor or review process. The WP’s over 750,000 articles are written and edited by volunteers. WP founder Jimmy Wales believes WP’s free, open, and largely unregulated process will evolve toward an Encyclopædia Britannica or better quality. But will this process actually yield a reliable, authoritative reference encompassing the entire range of human knowledge? Opinions are mixed.

WP claims to be the most popular reference site on the Internet. It has been hailed as the quintessence of the “wisdom of crowds,” as a model of democratized information, and as a nail in the coffin of the “stodgy old commercial encyclopedia.” Others are concerned about the reliability of an uncontrolled reference work that may include any number of purposeful or accidental inaccuracies. Some observers wonder why anyone would accept information from anonymous strangers of unknown qualifications. WP’s first editor in chief, Larry Sanger, believes that an anti-expertise bias among “Wikipedians” foreshadows the death of accuracy in scholarship (“Why Wikipedia Must Jettison Its Anti-Elitism”; www.kuro5hin.org/story/2004/12/30/142458/25). Robert McHenry, former editor of Encyclopædia Britannica, is even more blunt in asserting that the community-accretion process of Wikipedia is fundamentally incapable of rising to a high standard of excellence (“The Faith-Based Encyclopedia”; www.techcentralstation.com/111504A.html).

Regardless of which side you’re on, relying on Wikipedia presents numerous risks:

• Accuracy: You cannot be sure which information is accurate and which is not. Misinformation has a negative value; even if you get it for free, you’ve paid too much.

• Motives: You cannot know the motives of the contributors to an article. They may be altruists, political or commercial opportunists, practical jokers, or even vandals.

• Uncertain Expertise: Some contributors exceed their expertise and supply speculations, rumors, hearsay, or incorrect information. It is difficult to determine how qualified an article’s contributors are; the revision histories often identify them by pseudonyms, making it difficult to check credentials and sources.

• Volatility: Contributions and corrections may be negated by future contributors. One of the coauthors of this column found it disconcerting that he had the power to independently alter the Wikipedia article about himself. Volatility creates a conundrum for citations: Should you cite the version of the article that you read (meaning that those who follow your link may miss corrections and other improvements), or the latest version (which may differ significantly from the article you saw)?

• Coverage: Voluntary contributions largely represent the interests and knowledge of a self-selected set of contributors. They are not part of a careful plan to organize human knowledge. Topics that interest the young and Internet-savvy are well covered, while events that happened “before the Web” may be covered inadequately or inaccurately, if at all. More is written about current news than about historical knowledge.

• Sources: Many articles do not cite independent sources. Few articles contain citations to works not digitized and stored in the open Internet.
The foregoing effects can pollute enough information to undermine trust in the work as a whole. The WP organizers are aware of some of these risks, acknowledging that “Wikipedia contains no formal peer review process for fact-checking, and the editors themselves may not be well-versed in the topics they write about.” The organizers have established a background editorial process to mitigate some of the risks. Still, no one stands officially behind the authenticity and accuracy of any information in WP. There is no mechanism for subject-matter authorities to review and vouch for articles. There are no processes to ferret out little-known facts and include them, or to ensure that the full range of human knowledge, past and present, is represented. The Wikipedia is an interesting social experiment in knowledge compilation and codification. However, it cannot attain the status of a true encyclopedia without more formal content-inclusion and expert review procedures.

Collaboration


Dive into Peter J. Denning's collaborations.

Top Co-Authors

Michael C. Mulder

Bonneville Power Administration

Nicholas Dew

Naval Postgraduate School

Paul Young

University of Washington

Jack B. Dennis

Massachusetts Institute of Technology
