Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Kenneth J. Hintz is active.

Publication


Featured research published by Kenneth J. Hintz.


Systems, Man and Cybernetics | 1991

Multi-process constrained estimation

Kenneth J. Hintz; Eugene S. McVey

A method that maximizes the information flow through a constrained communications channel when it is desired to estimate the state of multiple nonstationary processes is described. The concept of a constrained channel is introduced as a channel that is not capable of transferring all of the information required. A measure of information is developed based on the estimation entropy utilizing the Kalman filter state estimator. It is shown that this measure of information can be used to determine which process to observe in order to maximize a measure of global information flow. For stationary processes, the sampling sequence can be computed a priori, but nonstationary processes require real-time sequence computation.
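
As an illustration of the selection principle described above (not the authors' exact formulation), the following Python sketch runs scalar Kalman-style variance updates and, at each step, observes the process whose measurement gives the largest expected reduction in estimation entropy; the process and noise variances are hypothetical.

    import math

    def predict(P, q):
        """Time update of the error variance for a scalar random-walk process."""
        return P + q

    def expected_info_gain(P_prior, r):
        """Entropy reduction (nats) of a scalar Gaussian state measured with
        noise variance r: 0.5 * ln(P_prior / P_post)."""
        P_post = P_prior * r / (P_prior + r)
        return 0.5 * math.log(P_prior / P_post)

    # Three hypothetical processes with different process/measurement noise.
    processes = [
        {"P": 1.0, "q": 0.50, "r": 0.2},
        {"P": 1.0, "q": 0.05, "r": 0.2},
        {"P": 1.0, "q": 0.20, "r": 1.0},
    ]

    for step in range(5):
        for p in processes:
            p["P"] = predict(p["P"], p["q"])           # every process drifts
        gains = [expected_info_gain(p["P"], p["r"]) for p in processes]
        chosen = max(range(len(processes)), key=gains.__getitem__)
        p = processes[chosen]
        p["P"] = p["P"] * p["r"] / (p["P"] + p["r"])   # measurement update for the chosen one
        print(f"step {step}: observe process {chosen}, gain {gains[chosen]:.3f} nats")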


Proceedings of the IEEE | 1993

Fractional Brownian motion models for synthetic aperture radar imagery scene segmentation

Clayton V. Stewart; Baback Moghaddam; Kenneth J. Hintz; Leslie M. Novak

The application of fractal random process models and their related scaling parameters as features in the analysis and segmentation of clutter in high-resolution, polarimetric synthetic aperture radar (SAR) imagery is demonstrated. Specifically, the fractal dimension of natural clutter sources, such as grass and trees, is computed and used as a texture feature for a Bayesian classifier. The SAR shadows are segmented in a separate manner using the original backscatter power as a discriminant. The proposed segmentation process yields a three-class segmentation map for the scenes considered in this study (with three clutter types: shadows, trees, and grass). The difficulty of computing texture metrics in high-speckle SAR imagery is addressed. In particular, a two-step preprocessing approach consisting of polarimetric minimum speckle filtering followed by noncoherent spatial averaging is used. The relevance of the resulting segmentation maps to constant-false-alarm-rate (CFAR) radar target detection techniques is discussed.
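
In the spirit of the texture feature above (though not the paper's exact procedure, which also includes polarimetric speckle filtering and a Bayesian classifier), the Python sketch below estimates a local fractal dimension from the slope of mean absolute intensity increments versus lag, so that rough patches score near 3 and smooth patches lower; the estimator, lags, and synthetic patches are illustrative assumptions.

    import numpy as np

    def local_fractal_dimension(patch, lags=(1, 2, 4, 8)):
        """Estimate the fractal dimension of an image patch treated as an intensity
        surface: fit the slope H of log E|I(x+d) - I(x)| versus log d (a
        variogram-style estimator) and use D = 3 - H for an fBm-like surface."""
        incs = []
        for d in lags:
            dx = np.abs(patch[:, d:] - patch[:, :-d]).mean()
            dy = np.abs(patch[d:, :] - patch[:-d, :]).mean()
            incs.append(0.5 * (dx + dy) + 1e-12)
        H, _ = np.polyfit(np.log(lags), np.log(incs), 1)
        return 3.0 - H

    # Illustrative use on synthetic patches: a strongly correlated (smooth) surface
    # versus white noise (rough).
    rng = np.random.default_rng(0)
    smooth = np.cumsum(np.cumsum(rng.normal(size=(64, 64)), 0), 1)
    rough = rng.normal(size=(64, 64))
    print("smooth patch D ~", round(local_fractal_dimension(smooth), 2))
    print("rough patch  D ~", round(local_fractal_dimension(rough), 2))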


Systems, Man and Cybernetics | 1991

A measure of the information gain attributable to cueing

Kenneth J. Hintz

The high false alarm rate of automatic target recognition systems used in conjunction with imaging systems has precluded their use as completely autonomous devices for the targeting and launching of weapons in military systems and has prompted their use primarily as an aid to final target recognition by a trained observer. Automatic target recognition systems used in this manner are termed cuers and are assumed by many to improve the probability of detection and recognition of a target by an observer searching an image. The efficacy of these cuers and methods of evaluating their performance are under investigation since no quantifiable measure exists. The author addresses the automated part of the problem and presents a proposed objective measure of cuer effectiveness based on an information measure. The usefulness of this measure is in discriminating among competing cuer designs. Cuers are classified according to the type of information that they extract from the image and a measure of information for each type of cuer is proposed.
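
To make the idea of cueing information concrete, the Python sketch below computes a generic entropy-reduction figure for a hypothetical cuer: the prior uncertainty about a single target's location over a grid of image cells, minus the expected posterior uncertainty once the cuer flags a subset of cells. This is only an illustration of an information measure for cueing, not the specific measure proposed in the paper, and every number is made up.

    import math

    def cue_information_gain(n_cells, n_cued, p_in_cue):
        """Entropy reduction (bits) for a cuer, under a simple model: the single
        target is equally likely to be in any of n_cells cells (H_prior =
        log2(n_cells)); the cuer flags n_cued cells which contain the target with
        probability p_in_cue, and the posterior is uniform within the cued and
        uncued sets."""
        def h2(p):  # binary entropy in bits
            return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)
        h_prior = math.log2(n_cells)
        h_post = (h2(p_in_cue)
                  + p_in_cue * math.log2(n_cued)
                  + (1 - p_in_cue) * math.log2(n_cells - n_cued))
        return h_prior - h_post

    # A cuer flagging 20 of 10,000 cells and containing the target 90% of the time:
    print(round(cue_information_gain(10_000, 20, 0.9), 2), "bits of search entropy removed")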


Proceedings of SPIE | 1996

Information theoretic approach to sensor scheduling

Gregory A. McIntyre; Kenneth J. Hintz

This paper demonstrates an approach to sensor scheduling and sensor management which effectively deals with the search/track decision problem. Every opportunity a sensor has to sense the environment equates to a certain amount of information which can be obtained about the state of the environment. A fundamental question is how to use this potential information to manage a suite of sensors while maximizing one's net knowledge about the state of the environment. The fundamental problem is whether to use one's resources to track targets already in track or to search for new ones. Inherent in this search/track problem is the further decision as to which sensor to use. A computer model has been developed that simulates a modest multiple-sensor, multiple-threat scenario. Target maneuvers are modeled using the Singer model for manned maneuvering vehicles. Each sensor's capabilities and characteristics are captured in the model by converting its energy constraints to a probability of detecting a target as a function of range and field of view (beamwidth). The environment is represented by a probability distribution of a target being at a given location. As the environment is sensed and targets are detected, the environment's probability distribution is continually updated to reflect the new probability state of the environment. This probability state represents the system's best estimate of the location of all targets in track and the probable location of as-yet-undetected targets.
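
The Python sketch below shows one way such a search/track decision can be cast in information terms, under assumptions that are mine rather than the paper's: a track revisit is scored by the entropy reduction of a scalar Kalman state, a search look by the expected reduction in occupancy uncertainty of a single grid cell (no false alarms), and the action with the larger expected gain is chosen.

    import math

    def h2(p):
        """Binary entropy in bits."""
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def search_gain(p_cell, p_detect):
        """Expected entropy reduction (bits) from looking at one grid cell that holds
        an undetected target with probability p_cell, with detection probability
        p_detect and no false alarms."""
        p_hit = p_cell * p_detect                              # probability the look detects
        p_miss_post = p_cell * (1 - p_detect) / (1 - p_hit)    # Bayes update after a miss
        return h2(p_cell) - (1 - p_hit) * h2(p_miss_post)      # a detection leaves no uncertainty

    def track_gain(P_prior, r):
        """Entropy reduction (bits) of a scalar track state measured with noise variance r."""
        P_post = P_prior * r / (P_prior + r)
        return 0.5 * math.log2(P_prior / P_post)

    # One hypothetical decision: revisit a track (variance 400 m^2, sensor noise 100 m^2)
    # or search a cell with a 15% occupancy probability and 80% detection probability.
    g_track, g_search = track_gain(400.0, 100.0), search_gain(0.15, 0.80)
    print("track gain %.2f bits, search gain %.2f bits ->" % (g_track, g_search),
          "track" if g_track > g_search else "search")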


The FASEB Journal | 2006

Can specific biological signals be digitized?

Wayne B. Jonas; John A. Ives; Florence M. Rollwagen; Daniel W. Denman; Kenneth J. Hintz; Mitchell R. Hammer; Cindy Crawford; Kurt Henry

At the request of the United States Defense Advanced Research Projects Agency, we attempted to replicate the data of Professor Jacques Benveniste that digital signals recorded on a computer disc produce specific biological effects. The hypothesis was that a digitized thrombin inhibitor signal would inhibit the fibrinogen‐thrombin coagulation pathway. Because of the controversies associated with previous research of Prof. Benveniste, we developed a system for the management of social controversy in science that incorporated an expert in social communication and conflict management. The social management approach was an adaptation of interactional communication theory, for management of areas that interfere with the conduct of good science. This process allowed us to successfully complete a coordinated effort by a multidisciplinary team, including Prof. Benveniste, a hematologist, engineer, skeptic, statistician, neuroscientist and conflict management expert. Our team found no replicable effects from digital signals.— Jonas, W. B., Ives, J. A., Rollwagen, F., Denman, D. W., Hintz K., Hammer, M., Crawford, C., Henry, K. Can specific biological signals be digitized? FASEB J. 20, 23–28 (2006)


Signal Processing, Sensor Fusion, and Target Recognition | 1999

Goal lattices for sensor management

Kenneth J. Hintz; Gregory A. McIntyre

A new methodology for quantifying the relative contribution of specific sensor actions to a set of mission goals is presented. The mission goals are treated as a set, and an ordering relationship is applied to it, leading to a partially ordered set which can be represented as a lattice. At each layer in the lattice, each goal's value is computed as the sum of the values it receives from the (higher) goals in which it is included, and its value is apportioned among the (lower) goals which it includes. A system designer is forced to make a zero-sum apportionment of each goal's value among those goals which it includes. The net result of this methodology is a quantifiable measure of the contributing value of each real type of sensor action to the system of goals, leading to more effective allocation of resources. While applied here to sensor scheduling, the method has applications to other decision-making processes as well.
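
A minimal Python sketch of the zero-sum apportionment idea follows. The goal names, layers, and weights are hypothetical; the point is only that each goal's value flows down the lattice to the sensor actions at the bottom while the total value is conserved.

    # Each higher goal splits its value among the lower goals it includes (shares sum
    # to 1), and each lower goal's value is the sum of the shares it receives.
    apportionment = {
        "accomplish_mission": {"protect_ownship": 0.6, "surveil_region": 0.4},
        "protect_ownship":    {"track_threats": 0.7, "search_for_pop_ups": 0.3},
        "surveil_region":     {"track_threats": 0.2, "search_for_pop_ups": 0.8},
        "track_threats":      {"measure_position": 1.0},
        "search_for_pop_ups": {"measure_position": 0.1, "scan_sector": 0.9},
    }

    # Process goals from the top of the lattice down (a topological order).
    order = ["accomplish_mission", "protect_ownship", "surveil_region",
             "track_threats", "search_for_pop_ups"]

    values = {"accomplish_mission": 1.0}   # the topmost goal carries all mission value
    for g in order:
        v = values.pop(g, 0.0)
        for lower, share in apportionment[g].items():
            values[lower] = values.get(lower, 0.0) + v * share

    print(values)   # value of each leaf sensor action; the shares still sum to 1.0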


Optical Engineering | 1998

Sensor measurement scheduling: an enhanced dynamic, preemptive algorithm

Gregory A. McIntyre; Kenneth J. Hintz

This paper presents an enhanced architecture for a sensor measurement scheduler as well as a dynamic sensor scheduling algorithm called the On-line, Greedy, Urgency-driven, Preemptive Scheduling Algorithm (OGUPSA). The premise is that the function of sensor management can be partitioned into the two tasks of information management, essentially an information-to-measurement mapping, and a sensor scheduler which takes the measurement requests along with their priorities and optimally maps them to a set of sensors. OGUPSA was developed using the three main scheduling policies of Most-Urgent-First to pick a task, Earliest-Completed-First to select a sensor, and Least-Versatile-First to resolve ties. By successive application of these policies, OGUPSA dynamically allocates, schedules, and distributes a set of measurement tasks from an information manager among a set of sensors. OGUPSA can detect the failure of a measurement task to meet a deadline and improves the dynamic load balance among all sensors while remaining a polynomial-time algorithm. One of the key components of OGUPSA is the information in the applicable sensor table. This table is the mechanism used to assign requested tasks to specific sensors. Subject terms: sensor scheduling, sensor management, resource scheduling, sensors.
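
The Python sketch below applies the three named policies to a few made-up tasks and sensors; it omits preemption, deadline handling, and the full applicable sensor table, so it is a toy of the greedy core rather than a faithful OGUPSA implementation.

    # Tasks: (task id, priority where higher = more urgent, duration in ms); all hypothetical.
    tasks = [
        ("update_track_7", 9, 20),
        ("search_sector_3", 5, 40),
        ("confirm_det_12", 7, 15),
    ]
    # Sensors: time each becomes free and the set of tasks it can serve
    # (a stand-in for the applicable sensor table).
    sensors = {
        "radar_A": {"free_at": 0,  "can_do": {"update_track_7", "confirm_det_12", "search_sector_3"}},
        "eo_B":    {"free_at": 10, "can_do": {"confirm_det_12"}},
    }

    schedule = []
    for name, prio, dur in sorted(tasks, key=lambda t: -t[1]):        # Most-Urgent-First
        applicable = [s for s, info in sensors.items() if name in info["can_do"]]
        # Earliest-Completed-First, ties broken by Least-Versatile-First.
        best = min(applicable,
                   key=lambda s: (sensors[s]["free_at"] + dur, len(sensors[s]["can_do"])))
        start = sensors[best]["free_at"]
        sensors[best]["free_at"] = start + dur
        schedule.append((name, best, start, start + dur))

    for entry in schedule:
        print(entry)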


International Conference on Multimedia Information Networking and Security | 2004

SNR improvements in NIITEK ground-penetrating radar

Kenneth J. Hintz

The spatial resolution and peak signal to average noise ratio of the NIITEK ground-penetrating RADAR are shown to be improved by the application of an inverse point spread function (PSF) operation. Both 1-D and 2-D point spread functions are developed. The 1-D inverse PSF is developed from the step discontinuity in impedance due to the air/ground interface. The 2-D inverse PSF is developed from an image of a small sphere in air. The sequential application of the two inverse PSFs using convolution in the spatial domain compensates for two distinctly different effects. The 1-D blurring is due to AC coupling of the RADAR, which produces a bipolar derivative of the narrow RADAR pulse; this is replaced by a single pulse in range which represents the location of an impedance discontinuity. The 2-D blurring is due to the wide beamwidth of the adjacent channels of the 24-channel linear array and operation in the near field of the antenna, which causes the characteristic parabolic scattering of a point signal due to adjacent-channel crosstalk. Over a set of 32 landmines at a government test site, an average improvement in peak signal to RMS noise ratio of 7.19 dB is realized for 1-D-only inverse filtering. When 1-D processing is combined with 2-D inverse filtering, an average SNR improvement of 7.68 dB is realized, with significant spatial resolution improvement. Subsequent processing with a moving average filter of size 2 and 4 in depth yields 8.14 and 9.88 dB net SNR improvement, respectively.
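
As a toy 1-D illustration of the inverse-PSF idea (not the NIITEK processing chain itself), the Python sketch below blurs a synthetic range profile with a bipolar, derivative-like PSF and then restores it with a regularized inverse filter applied in the frequency domain; the PSF, noise level, and regularization constant are all assumptions.

    import numpy as np

    # A toy range profile with two impedance discontinuities (synthetic data).
    profile = np.zeros(128)
    profile[40], profile[90] = 1.0, 0.6

    # AC coupling acts roughly like a differentiator: a bipolar point spread function.
    psf = np.zeros(128)
    psf[0], psf[1] = 1.0, -1.0
    blurred = np.real(np.fft.ifft(np.fft.fft(profile) * np.fft.fft(psf)))
    blurred += 0.01 * np.random.default_rng(0).normal(size=128)

    # Regularized (Wiener-style) inverse of the 1-D PSF; eps keeps the near-zero
    # low-frequency response of the derivative from being amplified without bound.
    H = np.fft.fft(psf)
    eps = 1e-2
    inverse_filter = np.conj(H) / (np.abs(H) ** 2 + eps)
    restored = np.real(np.fft.ifft(np.fft.fft(blurred) * inverse_filter))

    print("largest restored returns at range bins:", sorted(np.argsort(restored)[-2:]))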


Proceedings of SPIE | 1998

Comparison of several maneuvering target tracking models

Gregory A. McIntyre; Kenneth J. Hintz

The tracking of maneuvering targets is complicated by the fact that acceleration is not directly observable or measurable. Additionally, acceleration can be induced by a variety of sources including human input, autonomous guidance, or atmospheric disturbances. The approaches to tracking maneuvering targets can be divided into two categories, both of which assume that the maneuver input command is unknown. One approach is to model the maneuver as a random process. The other approach assumes that the maneuver is not random and that it is either detected or estimated in real time. The random process models generally assume one of two statistical properties, either white noise or an autocorrelated noise. The multiple-model approach is generally used with the white noise model, while a zero-mean, exponentially correlated acceleration approach is used with the autocorrelated noise model. The nonrandom approach uses maneuver detection to correct the state estimate or a variable-dimension filter to augment the state estimate with an extra state component during a detected maneuver. Another issue with the tracking of maneuvering targets is whether to implement the Kalman filter in polar or Cartesian coordinates. This paper examines and compares several exponentially correlated acceleration approaches in both polar and Cartesian coordinates for accuracy and computational complexity. They include the Singer model in both polar and Cartesian coordinates, the Singer model in polar coordinates converted to Cartesian coordinates, Helferty's third-order rational approximation of the Singer model, and the Bar-Shalom and Fortmann model. This paper shows that these models all provide very accurate position estimates with only minor differences in velocity estimates, and compares the computational complexity of the models.
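
For reference, the Python sketch below writes out the standard Singer (zero-mean, exponentially correlated acceleration) state transition matrix for one Cartesian axis, together with a commonly used small-T/tau approximation of its process noise covariance; the sample interval, maneuver time constant, and maneuver variance are hypothetical.

    import numpy as np

    def singer_transition(T, tau):
        """Singer-model transition matrix for the state [position, velocity,
        acceleration], with acceleration exponentially correlated with time
        constant tau (alpha = 1/tau)."""
        a = 1.0 / tau
        e = np.exp(-a * T)
        return np.array([
            [1.0, T,   (a * T - 1.0 + e) / a**2],
            [0.0, 1.0, (1.0 - e) / a],
            [0.0, 0.0, e],
        ])

    def singer_process_noise(T, tau, sigma_m):
        """Common approximation (valid for T much smaller than tau) of the Singer
        process noise covariance, with maneuver acceleration variance sigma_m**2."""
        q = 2.0 * (1.0 / tau) * sigma_m**2
        return q * np.array([
            [T**5 / 20, T**4 / 8, T**3 / 6],
            [T**4 / 8,  T**3 / 3, T**2 / 2],
            [T**3 / 6,  T**2 / 2, T],
        ])

    # One Cartesian axis: 1 s updates, 20 s maneuver time constant, 30 m/s^2 maneuvers.
    F = singer_transition(1.0, 20.0)
    Q = singer_process_noise(1.0, 20.0, 30.0)
    x = np.array([0.0, 200.0, 5.0])   # hypothetical [m, m/s, m/s^2]
    print(F @ x)                      # predicted state one sample interval ahead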


Characterization, Propagation, and Simulation of Sources and Backgrounds | 1991

Dimension and lacunarity measurement of IR images using Hilbert scanning

Baback Moghaddam; Kenneth J. Hintz; Clayton V. Stewart

This paper illustrates the use of fractal geometry and fractal metrics for analysis and characterization of natural textures and clutter in IR images in the wavelength band of 2-5 micrometers. In addition to the local fractal dimension, the lacunarity of textures is also briefly investigated. The addition of lacunarity significantly improves the pattern classification performance and is an important part of a complete fractal description of natural textures. A new measurement technique, based on the statistics of a space-filling curve, is presented. Specifically, a space-filling scan of an image texture is used to estimate the fractal dimension of the corresponding intensity surface. This unique one-dimensional representation is also used for measuring local texture features such as granularity and lacunarity.
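
A minimal Python sketch of the Hilbert-scan idea: map a 2^n x 2^n image to a one-dimensional sequence along a Hilbert curve, then estimate a Hurst-like roughness exponent from the scan's increment statistics. The synthetic image and the simple increment estimator are assumptions made for illustration, and lacunarity estimation is omitted.

    import numpy as np

    def d2xy(n, d):
        """Map distance d along a Hilbert curve to (x, y) on an n x n grid
        (n a power of two); standard iterative Hilbert-curve conversion."""
        x = y = 0
        t, s = d, 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                 # rotate the quadrant
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y

    n = 64
    rng = np.random.default_rng(1)
    image = np.cumsum(np.cumsum(rng.normal(size=(n, n)), 0), 1)   # synthetic smooth "texture"

    # Scan the image along the Hilbert curve to get a locality-preserving 1-D sequence.
    scan = np.array([image[d2xy(n, d)] for d in range(n * n)])

    # Roughness exponent H from increment statistics along the scan; for an
    # fBm-like profile the fractal dimension of the 1-D scan is about D = 2 - H.
    lags = np.array([1, 2, 4, 8, 16])
    incs = [np.mean(np.abs(scan[l:] - scan[:-l])) for l in lags]
    H, _ = np.polyfit(np.log(lags), np.log(incs), 1)
    print("estimated H along Hilbert scan:", round(H, 2), " D ~", round(2 - H, 2))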

Collaboration


Dive into Kenneth J. Hintz's collaborations.

Top Co-Authors

Catherine Cole

University of St Andrews

Nicola Allison

University of St Andrews

Erik Blasch

Air Force Research Laboratory
