Publication


Featured research published by Robert L. Cannon.


Pattern Recognition | 1985

Iterative fuzzy image segmentation

T. L. Huntsberger; C. L. Jacobs; Robert L. Cannon

Abstract The multispectral signature of features has been used for identification of objects in remotely sensed scenes for a number of years. Recently these techniques have been applied to feature selection in natural scenes. Due to the inherent noise and degradation of the input cues to the algorithms, meaningful image segmentation is a difficult process. In an effort to reduce the sensitivity of a system to these problems, we have developed an iterative fuzzy clustering technique for image segmentation. It is believed that this method represents an image segmentation scheme which can be used as a preprocessor for a multivalued logic based computer vision system.
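The fuzzy clustering the abstract refers to can be sketched with the standard fuzzy c-means updates (alternating membership and center computation). The one-band toy "image", parameter values, and variable names below are illustrative assumptions, not the paper's algorithm:

```python
# Minimal fuzzy c-means sketch (NumPy only); toy data, not the paper's method.
import numpy as np

def fuzzy_c_means(pixels, c=2, m=2.0, iters=50, seed=0):
    """Cluster 1-D feature values `pixels` into c fuzzy clusters."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(pixels)))
    u /= u.sum(axis=0)                       # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = um @ pixels / um.sum(axis=1)         # membership-weighted means
        d = np.abs(pixels[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))             # closer center -> higher membership
        u /= u.sum(axis=0)
    return centers, u

# Toy "image": dark background (~10) and a bright object (~200), plus noise
img = np.concatenate([np.full(50, 10.0), np.full(50, 200.0)])
img += np.random.default_rng(1).normal(0, 5, img.size)
centers, u = fuzzy_c_means(img, c=2)
labels = u.argmax(axis=0)                    # hard segmentation from memberships
```

Defuzzifying with `argmax` at the end yields the crisp segmentation; the fuzzy memberships themselves are what a multivalued-logic system downstream would consume.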


IEEE Transactions on Geoscience and Remote Sensing | 1986

Segmentation of a Thematic Mapper Image Using the Fuzzy c-Means Clustering Algorithm

Robert L. Cannon; J. V. Dave; James C. Bezdek; Mohan M. Trivedi

In this paper, a segmentation procedure that utilizes a clustering algorithm based upon fuzzy set theory is developed. The procedure operates in a nonparametric unsupervised mode. The feasibility of the methodology is demonstrated by segmenting a six-band Landsat-4 digital image with 324 scan lines and 392 pixels per scan line. For this image, 100-percent ground cover information is available for estimating the quality of segmentation. About 80 percent of the imaged area contains corn and soybean fields near the peak of their growing season. The remaining 20 percent of the image contains 12 different types of ground cover classes that appear in regions of different sizes and shapes. The segmentation method uses the fuzzy c-means algorithm in two stages. The large number of clusters resulting from this segmentation process are then merged by use of a similarity measure on the cluster centers. Results are presented to show that this two-stage process leads to separation of corn and soybean, and of several minor classes that would otherwise be overwhelmed in any practical one-stage clustering.
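The second stage merges the many first-stage clusters via a similarity measure on their centers. A minimal greedy stand-in is sketched below; the Euclidean distance threshold, the midpoint merge rule, and the toy two-band centers are assumptions, not the paper's similarity measure:

```python
import numpy as np

def merge_centers(centers, tol):
    """Repeatedly fuse the closest pair of cluster centers until every
    remaining pair is farther apart than `tol` (illustrative rule only)."""
    centers = [np.asarray(c, float) for c in centers]
    while len(centers) > 1:
        pairs = [(np.linalg.norm(centers[i] - centers[j]), i, j)
                 for i in range(len(centers)) for j in range(i + 1, len(centers))]
        d, i, j = min(pairs)
        if d > tol:                          # all remaining centers dissimilar
            break
        merged = (centers[i] + centers[j]) / 2
        centers = [c for k, c in enumerate(centers) if k not in (i, j)] + [merged]
    return centers

# Six first-stage centers in a two-band feature space collapse to two classes
stage1 = [(10, 12), (11, 13), (12, 11), (200, 180), (201, 182), (199, 179)]
classes = merge_centers(stage1, tol=20)
```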


Journal of Sedimentary Research | 1984

Petrographic Image Analysis, I. Analysis of Reservoir Pore Complexes

Robert Ehrlich; Stephen K. Kennedy; Sterling James Crabtree; Robert L. Cannon

ABSTRACT There exists a need to relate the petrology of reservoirs (pore geometry, surface areas of mineral phases and pores) to geophysical and petrophysical data. The end result is improved assessment of reservoir quality as well as better interpretation of well logs and seismic data. Petrographic Image Analysis (PIA) was developed from the beginning to interface with petrophysical/geophysical data. PIA relies upon computer-based image analysis using pattern recognition/classification programs, and so information can be obtained very rapidly, at a rate tied to the sophistication of the computer in use. PIA consists of a critical mix of hardware and software which perform four separate functions: 1) image acquisition; 2) image digitization; 3) image segmentation; and 4) image analysis. A special effort has been made to characterize the geometry of the pore complex. Separate spectra related to pore size and pore roughness are generated from each image. In addition, surface area per unit volume of pore can be estimated. Pore spectra can be decomposed and classified using pattern recognition/classification algorithms or used directly to estimate physical parameters.
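A pore-size spectrum of the kind described can be illustrated on a segmented binary image by labeling connected pore regions and collecting their areas. This flood-fill sketch (4-connectivity, area as the "size") is a generic illustration under those assumptions, not PIA's actual measurement:

```python
from collections import deque

def pore_size_spectrum(binary):
    """Label 4-connected pore pixels (value 1) by breadth-first flood fill
    and return the sorted list of pore areas, a crude size spectrum."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                area, q = 0, deque([(r, c)])
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                sizes.append(area)
    return sorted(sizes)

# Tiny segmented image: two 3-pixel pores
grid = [[0, 1, 1, 0, 0],
        [0, 1, 0, 0, 1],
        [0, 0, 0, 1, 1]]
spectrum = pore_size_spectrum(grid)   # → [3, 3]
```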


Journal of Geophysical Research | 1991

The simulation of the sedimentary fill of basins

C. G. St. C. Kendall; J. Strobel; Robert L. Cannon; James C. Bezdek; Gautam Biswas

This dissertation investigates the forward modeling of sedimentary basins using a simulation program, and the characterization of hydrocarbon fields or plays using an expert systems approach. The simulation program models processes associated with eustatic sea level, tectonic behavior, and rates of sediment accumulation in an attempt to explain the sediment geometries seen in a basin. The knowledge-based system, PLAYMAKER, is designed as an interactive system to aid geologists in characterizing their fields or prospects. SEDPAK is designed as an interactive computer simulation tool. It erects models of sedimentary geometries by filling in a two-dimensional basin from both sides with a combination of clastic sediment and/or in situ and transported carbonate sediments. The geometries of the clastic and carbonate sediments evolve through time in response to depositional processes that include tectonic movement, eustasy, and sedimentation. Clastic modeling includes sedimentary bypass and erosion and sedimentation in alluvial and coastal plains, marine shelf, basin slope and basin floor settings. Carbonate modeling includes progradation, the development of hard grounds, down slope aprons, keep up, catch up, back step and drowned reef facies as well as lagoonal and epeiric facies. Also included in the model are extensional vertical faulting of the basin, sediment compaction, and isostatic response to sediment loading. PLAYMAKER is an interactive query/response knowledge-based system that elicits field attributes and their qualities from the user in order to characterize a hydrocarbon play. The geologic model developed in PLAYMAKER describes a prospect in terms of its essential characteristics, such as basin type, structural style and history, location of the depositional setting, sediment type and geometry, facies model, reservoir quality, and source and seal potential.
The system is implemented using MIDST, a rule-based expert system shell that incorporates uncertain reasoning based on the Dempster-Shafer framework.
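The interplay of eustasy, subsidence, and sediment supply that such a forward model balances can be caricatured in one dimension: each time step the floor subsides, sea level oscillates, and a fixed sediment budget fills accommodation from the proximal side. Every parameter name and value below is an illustrative assumption, far simpler than SEDPAK:

```python
import math

def fill_basin(depths, steps=100, supply=0.5, subsidence=0.02, amp=2.0):
    """Toy 1-D basin fill: per step, subside the floor, move eustatic sea
    level along a sine curve, and spend the sediment budget filling
    accommodation column by column from the left."""
    depths = list(depths)                                # water depth per column
    for t in range(steps):
        sea = amp * math.sin(2 * math.pi * t / steps)    # eustatic cycle
        budget = supply
        for i in range(len(depths)):
            depths[i] += subsidence                      # tectonic subsidence
            accommodation = max(0.0, depths[i] + sea)    # space below sea level
            dep = min(budget, accommodation)
            depths[i] -= dep                             # sediment fills space
            budget -= dep
            if budget <= 0:
                break
    return depths

final = fill_basin([5.0, 10.0, 20.0])    # shallow shelf -> deep basin floor
```

Filling proximal columns first is what produces the wedge-shaped, prograding geometries such simulations aim to reproduce.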


Computers & Geosciences | 1989

Interactive (SEDPAK) simulation of clastic and carbonate sediments in shelf to basin settings

J. Strobel; Robert L. Cannon; C.G.St.C. Kendall; Gautam Biswas; James C. Bezdek

Abstract SEDPAK is an interactive computer simulation which erects models of sedimentary geometries by infilling a two-dimensional basin from both sides with a combination of clastic sediment or in situ and transported carbonate sediments. The simulation program is implemented in “C” on an Apollo DN3000 workstation using graphical plotting functions. Data entry, including the initial basin configuration, local tectonic behavior, sealevel curves, amount and source direction of clastic sediment, and the growth rates of carbonates as a function of water depth is performed interactively. The modeled geometries of clastic and carbonate sediments evolve through time and respond to depositional processes that include tectonic movement, eustasy, and sedimentation. Clastic modeling includes sedimentary bypass, erosion, and sedimentation in alluvial and coastal plains, marine shelf, basin slope, and basin floor settings. Carbonate modeling includes progradation, the development of hard grounds, downslope aprons, keep up, catch up, back step, and drowned reef facies as well as lagoonal and epeiric facies. Also included in the model are extensional vertical faulting of the basin, sediment compaction, and isostatic response to sediment loading. Sediment geometries are plotted on a graphics terminal as they are computed, so the user can immediately view the results. Then, based on these observations, parameters can be changed repeatedly and the program rerun until the user is satisfied with the resultant geometry.
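The abstract mentions carbonate growth rates specified as a function of water depth. A common idealization is zero growth above sea level, maximal growth in the shallow photic zone, and exponential decline below it; the functional form and constants here are assumptions, not SEDPAK's curve:

```python
import math

def carbonate_growth(depth, g_max=1.0, optimal=5.0, decay=15.0):
    """Illustrative depth-dependent carbonate growth rate (units arbitrary)."""
    if depth <= 0:
        return 0.0                               # subaerial exposure: no growth
    if depth <= optimal:
        return g_max                             # well-lit shallow water
    return g_max * math.exp(-(depth - optimal) / decay)   # light attenuates

rates = [carbonate_growth(d) for d in (0, 3, 5, 20, 100)]
```

A curve like this is why simulated reefs "keep up" while shallow, then "catch up" or drown as relative sea level outpaces growth.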


Computer Vision, Graphics, and Image Processing | 1988

Heuristics for intermediate level road finding algorithms

Sridhar Vasudevan; Robert L. Cannon; James C. Bezdek; William L. Cameron

This paper deals with detection of road networks in aerial imagery. In natural images, roadlike segments extracted by low level operators are highly fragmented. The main focus of this work is the development of an intermediate processing stage that addresses the task of partitioning and connecting the roadlike fragments. Heuristics based upon generally observed properties of roadlike features are used to cluster and integrate segments that appear perceptually continuous. The segments are represented as lists and processed symbolically. The proposed methodology is exemplified using three data sets: a suite of four synthetic segment sets which were used to test the sensitivity of the algorithm to thresholds and noise; a circularly polarized millimeter wave (TABILS 5) radar pseudo image; and a thematic mapper (LANDSAT) image.
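A perceptual-continuity test between two roadlike fragments typically combines endpoint proximity with orientation agreement. The sketch below is one such heuristic under assumed thresholds; it is not the paper's specific set of heuristics:

```python
import math

def perceptually_continuous(s1, s2, gap_tol=5.0, angle_tol=0.2):
    """Link test for two segments ((x1, y1), (x2, y2)): nearest endpoints
    must be close and orientations nearly parallel (radian tolerance).
    Thresholds are illustrative assumptions."""
    def angle(s):
        (x1, y1), (x2, y2) = s
        return math.atan2(y2 - y1, x2 - x1) % math.pi    # undirected orientation
    gap = min(math.dist(p, q) for p in s1 for q in s2)   # closest endpoint pair
    dang = abs(angle(s1) - angle(s2))
    dang = min(dang, math.pi - dang)                     # wrap around pi
    return gap <= gap_tol and dang <= angle_tol

a = ((0, 0), (10, 0))
b = ((12, 0.5), (22, 1.0))     # nearly collinear, small gap -> link
c = ((12, 0), (12, 10))        # perpendicular -> no link
```

Applying a test like this pairwise, then taking transitive closures, clusters fragments into candidate road chains for symbolic processing.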


Simulation | 1981

A generator program for models of discrete-event systems

Eswaran Subrahmanian; Robert L. Cannon

Researchers in simulation methodology have been attempting to develop both theories of modelling and software tools to assist in applying those theories. Using results from system theory and automata theory, Oren and Zeigler have demonstrated the need for the design and development of software for modular construction of models; manipulation, documentation, validation, and generation of simulation programs; and automatic verification of algorithms. We have implemented a software system based on their proposals. The user describes the structure of the model in a well-defined language. At this time, our system supports only discrete-event models using a next-event strategy; a typical example is a queuing model of service at a bank. However, this simple system demonstrates the feasibility of a more complex system.
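The next-event strategy named in the abstract advances the clock to the earliest pending event, processes it, and lets it schedule further events. A minimal single-teller bank queue in that style is sketched below; the exponential rates and structure are textbook assumptions, not the generator system the paper describes:

```python
import heapq
import random

def simulate_bank(n_customers=500, arrival_rate=1.0, service_rate=1.2, seed=0):
    """Next-event simulation of a single-teller FIFO queue (M/M/1-style).
    Returns (customers served, mean wait in queue)."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]   # (time, kind) heap
    queue, busy = [], False
    arrived = served = 0
    total_wait = 0.0
    while events:
        now, kind = heapq.heappop(events)        # jump clock to next event
        if kind == "arrival":
            arrived += 1
            if busy:
                queue.append(now)                # teller busy: join the line
            else:
                busy = True                      # service starts immediately
                heapq.heappush(events, (now + rng.expovariate(service_rate),
                                        "departure"))
            if arrived < n_customers:            # schedule the next arrival
                heapq.heappush(events,
                               (now + rng.expovariate(arrival_rate), "arrival"))
        else:                                    # departure frees the teller
            served += 1
            if queue:
                total_wait += now - queue.pop(0)
                heapq.heappush(events, (now + rng.expovariate(service_rate),
                                        "departure"))
            else:
                busy = False
    return served, total_wait / served

served, mean_wait = simulate_bank()
```

A generator program of the kind described would emit code with this event-loop skeleton from the user's model description rather than have it hand-written.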


Computer Vision, Graphics, and Image Processing | 1986

Decomposition and approximation of three-dimensional solids

Tsaiyun Phillips; Robert L. Cannon; Azriel Rosenfeld

Abstract In order to determine the physical properties of a rock sample represented digitally as a set of serial cross sections it is necessary first to decompose the sample into discrete objects and then to approximate each of those objects by another with well-defined mathematical properties. For decomposition, the convex enclosure is defined and shown to be a good approximation of the three-dimensional convex hull yet less complex to calculate. The convex enclosure approximates the convex hull of simple objects with error no more than 20%, and the enclosure deficiency can be used as a measure of the compactness of the object. Using this measure, it is possible to determine whether an object is sufficiently complex to require its decomposition into a set of subobjects. The decomposition procedure continues recursively until each subobject is sufficiently compact. The subobjects are then approximated by ellipsoids. For each subobject, the axes of the approximating ellipsoid are given by the eigenvalues and eigenvectors of the matrix used in the computation of its convex enclosure.
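The final ellipsoid-fitting step can be illustrated with an eigen-decomposition: for a voxel cloud, the eigenvectors of its covariance matrix give the ellipsoid's axis directions and the eigenvalues set the relative semi-axis lengths. This uses the point covariance as a stand-in for the paper's convex-enclosure matrix, which is an assumption:

```python
import numpy as np

def ellipsoid_axes(points):
    """Fit axis directions and relative semi-axis lengths of an ellipsoid
    to a 3-D point cloud via the covariance eigen-decomposition."""
    pts = np.asarray(points, float)
    cov = np.cov(pts.T)                      # 3x3 scatter of the object
    evals, evecs = np.linalg.eigh(cov)       # ascending eigenvalues
    radii = np.sqrt(np.maximum(evals, 0))    # semi-axis lengths (up to scale)
    return radii, evecs                      # evecs columns = axis directions

# Synthetic elongated "subobject": stretched 5x along the x axis
rng = np.random.default_rng(0)
cloud = rng.normal(size=(2000, 3)) * np.array([5.0, 1.0, 1.0])
radii, axes = ellipsoid_axes(cloud)
```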


Technical Symposium on Computer Science Education | 2002

Teaching a software project course using the team software process

Robert L. Cannon; Thomas B. Hilburn; Jorge L. Díaz-Herrera

The tutorial is intended for faculty who will be teaching or have taught a software project course. It provides attendees with ideas, concepts, guidelines, and experiences for teaching such a course using the introductory Team Software Process.


International Journal of Pattern Recognition and Artificial Intelligence | 1990

PLAYMAKER: A KNOWLEDGE-BASED APPROACH TO CHARACTERIZING HYDROCARBON PLAYS

Gautam Biswas; Xudong Yu; William J. Hagins; James C. Bezdek; J. Strobel; Christopher G. St. C. Kendall; Robert L. Cannon

This paper discusses the design and implementation of PLAYMAKER, a knowledge-based system for characterizing hydrocarbon plays. PLAYMAKER is a component of XX (eXpert eXplorer), a workstation-based tool that aids exploration geologists in a number of different tasks: sediment and carbonate simulation, play and field characterization, retrieval and storage of information in a geological database, comparison of the play or field under study with other fields in the database, and report generation. PLAYMAKER is implemented using MIDST (Mixed Inferencing Dempster-Shafer Tool), a rule-based expert system shell that incorporates mixed-initiative and inexact reasoning based on the Dempster-Shafer evidence combination scheme. This paper discusses the effectiveness of a two-level knowledge base structure adopted for the design and implementation of PLAYMAKER.
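The Dempster-Shafer evidence combination MIDST builds on can be shown with Dempster's rule: multiply the masses of two belief sources over intersecting focal elements and renormalize away the conflicting (empty-intersection) mass. This is the textbook rule with a made-up source-rock example, not MIDST's implementation:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb               # contradictory evidence
    k = 1.0 - conflict                        # renormalization constant
    return {s: v / k for s, v in combined.items()}

# Two hypothetical pieces of evidence about a play's source-rock quality
good, poor = frozenset({"good"}), frozenset({"poor"})
either = good | poor                          # ignorance: mass on the whole frame
m1 = {good: 0.6, either: 0.4}                 # source 1 leans "good"
m2 = {good: 0.3, poor: 0.5, either: 0.2}     # source 2 is more pessimistic
m = dempster_combine(m1, m2)
```

Placing mass on `either` is how Dempster-Shafer represents ignorance explicitly, which is the appeal of the framework for sparse geological evidence.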

Collaboration

Robert L. Cannon's top co-authors, all listed at the University of South Carolina:

- James C. Bezdek
- J. Strobel
- Philip Moore
- Robert Ehrlich
- C.G.St.C. Kendall