Publications


Featured research published by John M. Colombi.


Neural Networks | 1995

Neural networks for automatic target recognition

Steven K. Rogers; John M. Colombi; Curtis E. Martin; James C. Gainey; Kenneth H. Fielding; Tom J. Burns; Dennis W. Ruck; Matthew Kabrisky; Mark E. Oxley

Many applications reported in artificial neural networks are associated with military problems. This paper reviews concepts associated with the processing of military data to find and recognize targets, known as automatic target recognition (ATR). A general-purpose automatic target recognition system does not exist. The work presented here is demonstrated on military data, but it can only be considered proof of principle until systems are fielded and proven "under fire". ATR data can be in the form of non-imaging one-dimensional sensor returns, such as ultra-high range-resolution radar returns for air-to-air automatic target recognition and vibration signatures from a laser radar for recognition of ground targets. The ATR data can be two-dimensional images. The most common ATR images are infrared, but current systems must also deal with synthetic aperture radar images. Finally, the data can be three-dimensional, such as sequences of multiple exposures taken over time from a nonstationary world. Targets move, as do sensors, and that movement can be exploited by the ATR. Hyperspectral data, which are views of the same piece of the world looking at different spectral bands, are another example of multiple image data; the third dimension is now wavelength and not time. ATR system design usually consists of four stages. The first stage is to select the sensor or sensors to produce the target measurements. The next stage is the preprocessing of the data and the location of regions of interest within the data (segmentation). The human retina is a ruthless preprocessor. Physiology-motivated preprocessing and segmentation is demonstrated along with supervised and unsupervised artificial neural segmentation techniques. The third design step is feature extraction and selection: the extraction of a set of numbers which characterize regions of the data. The last step is the processing of the features for decision making (classification). The area of classification is where most ATR-related neural network research has been accomplished. The relation of neural classifiers to Bayesian techniques is emphasized, along with the more recent use of feature sequences to enhance classification. The principal theme of this paper is that artificial neural networks have proven to be an interesting and useful alternative processing strategy. Artificial neural techniques, however, are not magical solutions with mystical abilities that work without good engineering. A good understanding of the capabilities and limitations of neural techniques is required to apply them productively to ATR problems.
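As a rough illustration of the four-stage pipeline the abstract describes (sensor selection, preprocessing/segmentation, feature extraction, classification), here is a minimal Python sketch. The synthetic data, thresholds, and nearest-prototype classifier are invented for illustration and merely stand in for the neural techniques the paper surveys.

```python
# A minimal sketch of the four-stage ATR pipeline described above
# (sensor data -> segmentation -> feature extraction -> classification).
# All data and thresholds here are synthetic illustrations, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sense(target_present: bool, n: int = 128) -> np.ndarray:
    """Stage 1: a synthetic 1-D sensor return (e.g., a range profile)."""
    signal = np.zeros(n)
    if target_present:
        signal[40:60] = 2.0          # crude "target" response
    return signal + rng.normal(0, 0.5, n)

def segment(x: np.ndarray) -> np.ndarray:
    """Stage 2: keep the region of interest above a noise threshold."""
    mask = np.abs(x) > 1.0
    return x[mask] if mask.any() else x

def features(roi: np.ndarray) -> np.ndarray:
    """Stage 3: a tiny feature vector characterizing the region."""
    return np.array([roi.mean(), roi.std(), roi.max()])

def classify(f: np.ndarray, prototypes: dict) -> str:
    """Stage 4: nearest-prototype decision (a stand-in for a neural net)."""
    return min(prototypes, key=lambda k: np.linalg.norm(f - prototypes[k]))

# "Train" prototypes from labeled examples, then classify a new return.
prototypes = {
    label: np.mean([features(segment(sense(label == "target")))
                    for _ in range(50)], axis=0)
    for label in ("target", "clutter")
}
print(classify(features(segment(sense(True))), prototypes))  # expected: target
```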


Systems Engineering | 2013

An ontological framework for clarifying flexibility-related terminology via literature survey

Erin T. Ryan; David R. Jacques; John M. Colombi

Despite its ubiquity in the systems engineering literature, flexibility remains an ambiguous concept. There exist a multitude of definitions, which vary not only by domain, but within domains as well. Furthermore, these definitions often conflict with one another, making it difficult to discern the intended meaning in a given study or to form generalizations across studies. Complicating matters, there is a plethora of related terminology that is often used carelessly and/or interchangeably with flexibility. In this paper, we employ a novel ontological framework for clarifying salient aspects of extant flexibility-related terminology. While it was not possible to distill consensus definitions from the literature, we did identify certain dominant characteristics that enabled us to formulate a set of democratic definitions for flexibility, adaptability, and robustness, as well as recommended definitions for agility and versatility. We believe that the proposed definitions of these key system design principles may provide a baseline for improving analysis and communication among systems engineering practitioners and academics.


international conference on acoustics speech and signal processing | 1996

Cohort selection and word grammar effects for speaker recognition

John M. Colombi; Dennis W. Ruck; Timothy R. Anderson; Steven K. Rogers; Mark E. Oxley

Automatic speaker recognition systems are maturing, and databases have been designed specifically to compare algorithms and results against target error rates. The LDC YOHO speaker verification database was designed to test error rates at the 1% false rejection and 0.1% false acceptance level. This work examines the use of speaker-dependent (SD) monophone models to meet these requirements. By representing each speaker with 22 monophones, both closed-set speaker identification and global-threshold verification were performed. Using four combination-lock phrases, speaker identification error rates of 0.19% for males and 0.31% for females are obtained. By defining a test hypothesis, a critical error analysis for speaker verification is developed and new results reported for YOHO. A new Bhattacharyya distance measure is developed for cohort selection. This method, based on the second-order statistics of the enrollment Viterbi log-likelihoods, determines the optimal cohorts and achieves an equal error rate of 0.282%.
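The cohort-selection idea lends itself to a short worked sketch. Assuming each speaker's enrollment Viterbi log-likelihoods are summarized by a univariate Gaussian (the paper's exact statistic and HMM scoring are not reproduced here), the standard Bhattacharyya distance between two Gaussians can rank candidate cohort speakers:

```python
# A hedged sketch of Bhattacharyya-distance cohort selection: model each
# speaker's enrollment (Viterbi) log-likelihood scores as a 1-D Gaussian
# and pick the closest speakers as the cohort. Speaker names and scores
# are illustrative only.
import math

def bhattacharyya(mu1, var1, mu2, var2):
    """Bhattacharyya distance between two univariate Gaussians."""
    return (0.25 * (mu1 - mu2) ** 2 / (var1 + var2)
            + 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2))))

def select_cohort(scores_by_speaker, claimant, size=5):
    """Rank other speakers by distance to the claimant's score distribution."""
    def stats(xs):
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / (len(xs) - 1)
        return mu, var
    mu_c, var_c = stats(scores_by_speaker[claimant])
    others = [(bhattacharyya(mu_c, var_c, *stats(v)), s)
              for s, v in scores_by_speaker.items() if s != claimant]
    return [s for _, s in sorted(others)[:size]]

# Illustrative enrollment log-likelihoods for three speakers.
scores = {"spk1": [-10.2, -9.8, -10.0], "spk2": [-12.1, -11.9, -12.3],
          "spk3": [-10.4, -10.1, -9.9]}
print(select_cohort(scores, "spk1", size=1))  # -> ['spk3']
```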


Procedia Computer Science | 2013

A Fuzzy Evaluation Method for System of Systems Meta-architectures

Louis Pape; Kristin Giammarco; John M. Colombi; Cihan H. Dagli; Nil H. Kilicay-Ergin; George Rebovich

A method is proposed for evaluating a range of System of Systems (SoS) meta-architecture alternatives. An SoS is composed by combining existing, fully functioning Systems, possibly with minor functional changes, so that the combined Systems achieve a new capability not available from the Systems alone. The meta-architecture describes how all possible subsets of Systems can be combined to create an SoS. The fitness of a realizable SoS architecture may be characterized by terms such as unacceptable, marginal, above average, or excellent. While these terms provide little information about the SoS when used alone and informally, they readily fit into fuzzy membership sets that overlap at their boundaries. More descriptive attributes lend themselves readily to fuzzy evaluation as well: "ease of use," which might depend on the individual user and a set of conditions; "mission effectiveness" over a particular suite of missions; and "affordability," which may change over time with a changing business climate. An approach to defining the fuzzy concepts and establishing rule sets that provide an overall SoS evaluation for the many sets of participating individual Systems represented by the meta-architecture is discussed. An application of the method is discussed within the framework of developing and evaluating a hypothetical Intelligence, Surveillance and Reconnaissance (ISR) SoS capability.
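To make the overlapping-membership idea concrete, here is a minimal sketch of fuzzy evaluation over a normalized fitness score. The membership shapes, attribute names, and the single blending rule are illustrative assumptions, not the paper's actual rule base.

```python
# A minimal sketch of the fuzzy-evaluation idea: overlapping membership
# functions over a 0-1 score, plus a toy rule combining two attributes.
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Overlapping fuzzy sets for an SoS fitness score in [0, 1].
FITNESS = {
    "unacceptable":  lambda x: tri(x, -0.01, 0.0, 0.35),
    "marginal":      lambda x: tri(x, 0.2, 0.4, 0.6),
    "above average": lambda x: tri(x, 0.45, 0.65, 0.85),
    "excellent":     lambda x: tri(x, 0.7, 1.0, 1.01),
}

def evaluate(mission_effectiveness, affordability):
    """Toy rule: overall score is a weighted blend; return the best label."""
    score = 0.7 * mission_effectiveness + 0.3 * affordability
    memberships = {label: f(score) for label, f in FITNESS.items()}
    return max(memberships, key=memberships.get), memberships

label, m = evaluate(mission_effectiveness=0.8, affordability=0.5)
print(label)  # -> "above average"
```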


Systems Engineering | 2012

Predictive mental workload modeling for semiautonomous system design: Implications for systems of systems

John M. Colombi; Michael E. Miller; Michael Schneider; Major Jason McGrogan; Colonel David S. Long; John Plaga

Predictive mental workload modeling is one established tool within the broad systems engineering activity of Human Systems Integration (HSI). Using system architecture as the foundation, this paper explores the use of Multiple Resource Theory to create representative workload models for evaluating operational system-of-systems (SoS) concepts. Through careful consideration of task demands, conflict generated between tasks, and workload mitigation strategies, informed design decisions can improve overall human-system performance. An example involving a single pilot controlling multiple remotely piloted aircraft (RPA) is presented to illustrate the use of workload modeling. Several drivers of measurably excessive workload are observed: multitasking, communications, continuously updating situational awareness, and mission planning. In addition, three metrics are proposed for incorporating human workload analysis during system design. This technique has applicability across a wide range of systems of systems and operational concepts involving complex human-system interactions.
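The core Multiple Resource Theory computation can be sketched briefly: total workload is the sum of per-task resource demands plus a penalty for each pair of concurrent tasks that loads the same resource channel. The channels, demand values, and conflict weight below are illustrative assumptions, not the paper's calibrated model.

```python
# A hedged sketch of a Multiple Resource Theory style workload estimate
# for the single-pilot, multiple-RPA example in the abstract.
from itertools import combinations

CONFLICT_WEIGHT = 0.5  # assumed penalty per unit of shared-channel demand

# Resource demands per task, keyed by channel (visual, auditory, ...).
tasks = {
    "fly RPA 1":     {"visual": 3.0, "cognitive": 2.0, "motor": 2.0},
    "monitor RPA 2": {"visual": 2.0, "cognitive": 1.0},
    "radio call":    {"auditory": 2.0, "verbal": 2.0, "cognitive": 1.0},
}

def workload(active):
    """Sum demands, then add conflict for channels shared by task pairs."""
    total = sum(sum(tasks[t].values()) for t in active)
    for t1, t2 in combinations(active, 2):
        shared = set(tasks[t1]) & set(tasks[t2])
        total += CONFLICT_WEIGHT * sum(
            min(tasks[t1][ch], tasks[t2][ch]) for ch in shared)
    return total

print(workload(["fly RPA 1"]))                                 # 7.0
print(workload(["fly RPA 1", "monitor RPA 2", "radio call"]))  # 17.5
```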


IEEE Access | 2015

A Modeling Framework for Studying Quantum Key Distribution System Implementation Nonidealities

Logan O. Mailloux; Jeffrey D. Morris; Michael R. Grimaila; Douglas D. Hodson; David R. Jacques; John M. Colombi; Colin V. McLaughlin; Jennifer A. Holes

Quantum key distribution (QKD) is an innovative technology that exploits the laws of quantum mechanics to generate and distribute unconditionally secure shared key for use in cryptographic applications. However, QKD is a relatively nascent technology where real-world system implementations differ significantly from their ideal theoretical representations. In this paper, we introduce a modeling framework built upon the OMNeT++ discrete event simulation framework to study the impact of implementation nonidealities on QKD system performance and security. Specifically, we demonstrate the capability to study device imperfections and practical engineering limitations through the modeling and simulation of a polarization-based, prepare-and-measure BB84 QKD reference architecture. The reference architecture allows users to model and study complex interactions between physical phenomena and system-level behaviors representative of real-world design and implementation tradeoffs. Our results demonstrate the flexibility of the framework to simulate and evaluate current, future, and notional QKD protocols and components.
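A minimal sketch can convey what such a simulation captures: BB84 prepare-and-measure with two simple nonidealities (channel loss and a detection error rate), estimating the sifted-key fraction and quantum bit error rate (QBER). The parameter values are illustrative assumptions; the paper's reference architecture is built on OMNeT++, not this toy loop.

```python
# A toy BB84 simulation with two nonidealities: photon loss and bit-flip
# detection errors. Parameters are illustrative, not from the paper.
import random

random.seed(1)
N = 100_000
LOSS = 0.10        # probability a photon is lost in the channel
ERROR = 0.02       # probability a detected bit flips (misalignment, noise)

sifted = errors = 0
for _ in range(N):
    bit, a_basis = random.randint(0, 1), random.randint(0, 1)  # Alice
    if random.random() < LOSS:                                 # photon lost
        continue
    b_basis = random.randint(0, 1)                             # Bob
    if a_basis != b_basis:                                     # sifting
        continue
    measured = bit ^ (random.random() < ERROR)                 # noisy detect
    sifted += 1
    errors += measured != bit

print(f"sifted fraction: {sifted / N:.3f}")   # ~ (1-LOSS)/2 = 0.45
print(f"QBER: {errors / sifted:.3f}")         # ~ ERROR = 0.02
```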


The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology | 2009

A General Method of Measuring Interoperability and Describing Its Impact on Operational Effectiveness

Thomas C. Ford; John M. Colombi; David R. Jacques; Scott R. Graham

A general method of measuring the interoperability of a heterogeneous set of systems, experiencing any type and number of interoperations, in the context of an operational process is given. Furthermore, for confrontational operations (friendly versus adversary), the method gives sufficient conditions for relating the interoperability measurement to operational effectiveness. Owing to the difficulty of creating a general method of interoperability measurement, developers of extant interoperability assessment methods have relied upon problem decomposition to produce methods of assessing the interoperability of specific types of systems experiencing distinct modes of interoperation. Unfortunately, this approach fractured the problem, effectively driving those methods further from a general solution. Therefore, in this research, a holistic, fundamental, and flexible means of describing a general method of interoperability measurement was developed, which models systems according to their interoperability-related features in the context of an operational process. An application of the method highlights the new concept of confrontational interoperability, demonstrates the relationship between interoperability and operational effectiveness in the context of a suppression of enemy air defenses (SEAD) scenario, and illustrates how an interoperability measurement can motivate system upgrades.
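A deliberately simplified illustration of the central idea: represent each system by its interoperability-related features and score pairs by feature overlap. The feature names and the similarity measure (Jaccard) are assumptions for illustration; the paper develops a more general method set in the context of an operational process.

```python
# A hedged illustration: systems as feature sets, pairwise interoperability
# as feature overlap. Feature names are invented for this example.
def interoperability(sys_a, sys_b):
    """Jaccard overlap of interoperability-related features."""
    return len(sys_a & sys_b) / len(sys_a | sys_b)

fighter = {"Link16", "UHF_voice", "GPS"}
jammer  = {"Link16", "UHF_voice"}
awacs   = {"Link16", "UHF_voice", "GPS", "SATCOM"}

print(interoperability(fighter, jammer))  # 2/3: upgrade candidate visible
print(interoperability(fighter, awacs))   # 3/4
```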


The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology | 2015

Modeling decoy state Quantum Key Distribution systems

Logan O. Mailloux; Ryan D. L. Engle; Michael R. Grimaila; Douglas D. Hodson; John M. Colombi; C V McLaughlin

Quantum Key Distribution (QKD) is an innovative technology which exploits the laws of quantum physics to generate and distribute shared secret key for use in cryptographic devices. QKD offers the advantage of 'unconditionally secure' key generation with the unique ability to detect eavesdropping on the key distribution channel, and shows promise for high-security applications such as those found in banking, government, and military environments. However, QKD is a nascent technology where realized systems suffer from implementation non-idealities, which may significantly impact system performance and security. In this article, we discuss the modeling of a decoy state enabled QKD system built to study the impact of these practical limitations. Specifically, we present a thorough background on the decoy state protocol, a detailed discussion of the modeled decoy state enabled QKD system, and evidence for component and sub-system verification, as well as multiple examples of system-level validation. Additionally, we bring attention to practical considerations, gained from these research activities, associated with implementing the decoy state protocol's security conditions.
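The decoy state idea admits a short worked example: gains measured at two source intensities (signal mu, decoy nu) yield a lower bound on the single-photon yield Y1, here using the widely cited weak-decoy bound of Ma et al. (2005). The channel parameters below are illustrative assumptions, not those of the modeled system.

```python
# A worked sketch of the decoy-state bound: use gains at two intensities
# to lower-bound the single-photon yield Y1. Parameters are illustrative.
import math

eta, Y0 = 0.10, 1e-5        # assumed channel/detector efficiency, dark rate
mu, nu = 0.5, 0.1           # signal and decoy mean photon numbers

def gain(intensity):
    """Overall gain Q for a Poissonian source over a lossy channel."""
    return Y0 + 1 - math.exp(-eta * intensity)

Q_mu, Q_nu = gain(mu), gain(nu)

# Weak-decoy lower bound on the single-photon yield (Ma et al. 2005).
Y1_lower = (mu / (mu * nu - nu**2)) * (
    Q_nu * math.exp(nu)
    - Q_mu * math.exp(mu) * (nu / mu) ** 2
    - ((mu**2 - nu**2) / mu**2) * Y0
)

print(f"estimated Y1 lower bound: {Y1_lower:.4f}")           # ~0.097
print(f"actual Y1 in this model:  {Y0 + eta - Y0*eta:.4f}")  # ~0.100
```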


international conference on system of systems engineering | 2012

Modeling system of systems acquisition

Nil H. Kilicay-Ergin; Paulette Acheson; John M. Colombi; Cihan H. Dagli

System of systems (SoS) acquisition is a dynamic process of integrating independent systems. This paper describes modeling of the SoS acquisition environment based on the Wave Process Model. Agent-based modeling methodology is utilized to abstract behavioral aspects of the acquisition process.
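A heavily hedged sketch of the agent-based flavor this abstract describes: in each "wave" of the acquisition process, independent system agents decide whether to join the SoS based on a negotiated incentive. The decision rule and numbers are invented for illustration; the paper's Wave Process Model is richer than this loop.

```python
# Toy agent-based wave model: an SoS manager raises incentives each wave,
# and independent systems join when the incentive exceeds their reluctance.
import random

random.seed(7)

class SystemAgent:
    def __init__(self, name):
        self.name = name
        self.reluctance = random.random()   # how costly joining feels
        self.joined = False

    def negotiate(self, incentive):
        if not self.joined and incentive > self.reluctance:
            self.joined = True

agents = [SystemAgent(f"sys{i}") for i in range(8)]
for wave in range(1, 4):
    incentive = 0.3 * wave
    for a in agents:
        a.negotiate(incentive)
    print(f"wave {wave}: joined = {[a.name for a in agents if a.joined]}")
```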


Systems Engineering | 2015

Disaggregated Space System Concept Optimization: Model-Based Conceptual Design Methods

Robert Thompson; John M. Colombi; Jonathan T. Black; Bradley Ayres

Optimal design techniques have proven to be an effective systems engineering tool. Using systems architecture as the foundation, this paper explores the use of mixed-variable optimization models for synthesizing and evaluating disaggregated space system concepts. Model-based conceptual design (MBCD) techniques are used to identify and assess system architectures based upon estimated system cost and performance trades. The Disaggregated Integral System Concept Optimization (DISCO) methodology is introduced and then applied to a space-based fire detection mission. Results indicate potential cost-effectiveness gains from concept design optimization of this mission. The general methodology has broad applicability for MBCD of systems, but is particularly useful for dynamic, nonlinear disaggregated space systems.
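Mixed-variable concept optimization can be sketched in a few lines: search over a discrete architecture choice (number of satellites) and a discretized continuous one (altitude), scoring each concept by a toy cost/performance tradeoff. The objective and constants are invented for illustration; DISCO's actual models estimate real cost and performance.

```python
# A hedged sketch of mixed-variable concept search for a fire-detection
# constellation. Cost and coverage models are toy stand-ins.
import itertools

def cost(n_sats, altitude_km):
    """Toy cost: more satellites cost more; higher orbits cost more."""
    return n_sats * (50 + 0.02 * altitude_km)

def coverage(n_sats, altitude_km):
    """Toy performance: coverage grows with constellation size and altitude."""
    return min(1.0, n_sats * altitude_km / 6000.0)

candidates = itertools.product(range(1, 9), (400, 600, 800, 1000))
feasible = [(n, h) for n, h in candidates if coverage(n, h) >= 0.9]
best = min(feasible, key=lambda c: cost(*c))
print(best, cost(*best))   # cheapest concept meeting the coverage floor
```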

Collaboration


Dive into John M. Colombi's collaboration.

Top Co-Authors

David R. Jacques, Air Force Institute of Technology
Michael E. Miller, Air Force Institute of Technology
Michael R. Grimaila, Air Force Institute of Technology
Nicholas Hardman, Air Force Institute of Technology
Dennis W. Ruck, Air Force Institute of Technology
Richard G. Cobb, Air Force Institute of Technology
Thomas C. Ford, Air Force Institute of Technology
Steven K. Rogers, Air Force Research Laboratory
Alan W. Johnson, Air Force Institute of Technology
Erin T. Ryan, Air Force Institute of Technology