Publication


Featured research published by Dale C. Linne von Berg.


Applied Optics | 2014

Imaging systems and applications: Introduction to the feature

Francisco Imai; Dale C. Linne von Berg; T. Skauli; Shoji Tominaga; Zeev Zalevsky

Imaging systems have numerous applications in industrial, military, consumer, and medical settings. Assembling a complete imaging system requires the integration of optics, sensing, image processing, and display rendering. This feature issue presents original research ranging from the design of stimuli for human perception, optics applications, and image enhancement to novel imaging modalities in color and infrared spectral imaging and gigapixel imaging, as well as a systems perspective on imaging.


Proceedings of SPIE | 2010

Multisensor airborne imagery collection and processing onboard small unmanned systems

Dale C. Linne von Berg; Scott A. Anderson; Alan Bird; Niel Holt; Melvin R. Kruer; Thomas J. Walls; Michael L. Wilson

FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) is an ONR-funded effort to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The program is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL). FEATHAR has developed and integrated EyePod, a combined long-wave infrared (LWIR) and visible-to-near-infrared (VNIR) optical survey and inspection system, with NuSAR, a combined dual-band synthetic aperture radar (SAR) system. These sensors are being tested in conjunction with other ground and airborne sensor systems to demonstrate intelligent real-time cross-sensor cueing and in-air data fusion. Results from test flights of the EyePod and NuSAR sensors will be presented.


SPIE's International Symposium on Optical Science, Engineering, and Instrumentation | 1998

Image compression for airborne reconnaissance

Dale C. Linne von Berg; Melvin R. Kruer

The volume of digital imagery generated by existing and planned airborne reconnaissance systems requires the use of lossy compression techniques in order to store the real-time imagery on board or transmit it to a ground station. The government is migrating compression used for reconnaissance applications from proprietary techniques to national and international standards in order to provide a more seamless image dissemination path. This paper describes the requirements for image compression in advanced tactical reconnaissance systems and compares the performance of national and international lossy compression techniques.
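
As a rough, hedged illustration of the size-versus-fidelity trade that drives lossy compression choices for reconnaissance imagery, the sketch below compresses a synthetic frame at several JPEG quality settings and reports compression ratio and PSNR. JPEG via Pillow is used only as a stand-in; the paper's actual comparison of national and international standards is not reproduced, and the test frame is synthetic.

```python
# Illustrative sketch only (not the paper's evaluation): trade compressed size
# against fidelity for one synthetic 8-bit frame at several JPEG quality levels.
import io
import numpy as np
from PIL import Image

def psnr_db(reference: np.ndarray, test: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit imagery."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 20.0 * np.log10(255.0 / np.sqrt(mse))

# Synthetic stand-in for a reconnaissance frame: smooth gradient plus mild noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 512)
frame = np.clip(np.outer(x, x) * 220.0 + rng.normal(0.0, 8.0, (512, 512)), 0, 255).astype(np.uint8)

for quality in (95, 75, 50, 25):
    buf = io.BytesIO()
    Image.fromarray(frame).save(buf, format="JPEG", quality=quality)
    decoded = np.asarray(Image.open(io.BytesIO(buf.getvalue())))
    ratio = frame.nbytes / len(buf.getvalue())
    print(f"quality={quality:3d}  compression={ratio:5.1f}:1  PSNR={psnr_db(frame, decoded):5.1f} dB")
```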


Proceedings of SPIE | 2010

Use of compact synthetic aperture radar systems to assist with device detection and discrimination

Mark Jensen; Thomas J. Walls; Scott A. Anderson; Dale C. Linne von Berg; Niel Holt; Melvin R. Kruer; David G. Long; Michael L. Wilson

NuSAR (Naval Research Laboratory Unmanned Synthetic Aperture Radar) is a sensor developed under the ONR-funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program. FEATHAR is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL). FEATHAR's goal is to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). NuSAR is a novel dual-band (L- and X-band) SAR capable of a variety of tactically relevant operating modes and detection capabilities. Flight test results will be described for narrow and wide bandwidth and narrow and wide azimuth aperture operating modes.
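
The narrow/wide bandwidth and narrow/wide azimuth-aperture modes mentioned above map onto the textbook SAR resolution relations; a minimal sketch follows, using assumed example bandwidths, synthetic-aperture lengths, and a 5 km slant range rather than NuSAR's actual (unstated) parameters.

```python
# Minimal sketch of the standard SAR resolution relations behind the narrow/wide
# bandwidth and azimuth-aperture modes. All numeric values are illustrative
# assumptions, not NuSAR specifications.
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Slant-range resolution of a pulse-compressed SAR: delta_r = c / (2 B)."""
    return C / (2.0 * bandwidth_hz)

def azimuth_resolution_m(wavelength_m: float, slant_range_m: float, synthetic_aperture_m: float) -> float:
    """Azimuth resolution for a synthetic aperture of length L_sa: delta_az = lambda * R / (2 * L_sa)."""
    return wavelength_m * slant_range_m / (2.0 * synthetic_aperture_m)

for band, wavelength in (("L-band", 0.23), ("X-band", 0.031)):  # approximate wavelengths, m
    for bandwidth in (50e6, 200e6):                              # assumed narrow/wide bandwidths, Hz
        print(f"{band}: B={bandwidth/1e6:.0f} MHz -> range resolution {range_resolution_m(bandwidth):.2f} m")
    for aperture in (50.0, 400.0):                               # assumed narrow/wide synthetic apertures, m
        print(f"{band}: L_sa={aperture:.0f} m at R=5 km -> azimuth resolution "
              f"{azimuth_resolution_m(wavelength, 5_000.0, aperture):.2f} m")
```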


Proceedings of SPIE | 2010

Compact survey and inspection day/night image sensor suite for small unmanned aircraft systems (EyePod)

Alan Bird; Scott A. Anderson; Dale C. Linne von Berg; Morgan Davidson; Niel Holt; Melvin R. Kruer; Michael L. Wilson

EyePod is a compact survey and inspection day/night imaging sensor suite for small unmanned aircraft systems (UAS). EyePod generates georeferenced image products in real time from visible near-infrared (VNIR) and long-wave infrared (LWIR) imaging sensors and was developed under the ONR-funded FEATHAR (Fusion, Exploitation, Algorithms, and Targeting for High-Altitude Reconnaissance) program. FEATHAR is being directed and executed by the Naval Research Laboratory (NRL) in conjunction with the Space Dynamics Laboratory (SDL), and FEATHAR's goal is to develop and test new tactical sensor systems specifically designed for small manned and unmanned platforms (payload weight < 50 lbs). The EyePod suite consists of two VNIR/LWIR (day/night) gimbaled sensors that, combined, provide broad-area survey and focused inspection capabilities. Each EyePod sensor pairs an HD visible EO sensor with an LWIR bolometric imager, providing precision geo-referenced and fully digital EO/IR NITFS output imagery. The LWIR sensor is mounted on a patent-pending jitter-reduction stage to correct for the high-frequency motion typically found on small aircraft and unmanned systems. Details will be presented on both the wide-area and inspection EyePod sensor systems, their modes of operation, and results from recent flight demonstrations.
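
As a hedged illustration of what "georeferenced image products" involves at its simplest, the sketch below intersects a pixel's line-of-sight ray with a flat ground plane given the platform position and look direction. This is generic geometry under assumed numbers, not EyePod's actual NITFS georeferencing pipeline.

```python
# Hedged sketch of flat-terrain pixel geolocation: intersect a line-of-sight ray
# with a level ground plane. Frame, names, and numbers are illustrative assumptions.
import numpy as np

def ground_intersection(sensor_pos_enu: np.ndarray, los_unit_enu: np.ndarray) -> np.ndarray:
    """Return the East/North ground point hit by a downward-looking ray.

    sensor_pos_enu: (E, N, U) metres, with U = height above the ground plane.
    los_unit_enu:   unit line-of-sight vector in the same East/North/Up frame.
    """
    if los_unit_enu[2] >= 0.0:
        raise ValueError("line of sight does not intersect the ground plane")
    t = -sensor_pos_enu[2] / los_unit_enu[2]           # distance along the ray to U = 0
    return (sensor_pos_enu + t * los_unit_enu)[:2]     # (E, N) of the ground point

# Example: aircraft 1200 m above ground, looking 30 degrees off nadir toward the East.
pos = np.array([0.0, 0.0, 1200.0])
off_nadir = np.radians(30.0)
los = np.array([np.sin(off_nadir), 0.0, -np.cos(off_nadir)])
print(ground_intersection(pos, los))   # approximately [692.8, 0.0] m East of the aircraft
```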


Airborne Intelligence, Surveillance, Reconnaissance (ISR) Systems and Applications V | 2008

DUSTER: demonstration of an integrated LWIR-VNIR-SAR imaging system

Michael L. Wilson; Dale C. Linne von Berg; Melvin R. Kruer; Niel Holt; Scott A. Anderson; David G. Long; Yuly Margulis

The Naval Research Laboratory (NRL) and Space Dynamics Laboratory (SDL) are executing a joint effort, DUSTER (Deployable Unmanned System for Targeting, Exploitation, and Reconnaissance), to develop and test a new tactical sensor system specifically designed for Tier II UAVs. The system is composed of two coupled near-real-time sensors: EyePod (VNIR/LWIR ball gimbal) and NuSAR (L-band synthetic aperture radar). EyePod consists of a jitter-stabilized LWIR sensor coupled with a dual focal-length optical system and a bore-sighted high-resolution VNIR sensor. The dual focal-length design, coupled with precision pointing and step-stare capabilities, enables EyePod to conduct wide-area survey and high-resolution inspection missions from a single flight pass. NuSAR is being developed with partners Brigham Young University (BYU) and Artemis, Inc., and consists of a wideband L-band SAR capable of large-area survey and embedded real-time image formation. Both sensors employ standard Ethernet interfaces and provide geo-registered NITFS output imagery. In the fall of 2007, field tests were conducted with both sensors, and results will be presented.


Rundbrief Der Gi-fachgruppe 5.10 Informationssystem-architekturen | 2014

Navy Imaging Systems

Dale C. Linne von Berg

Naval environments require sensing modalities that span the spectral imaging regime. Recent advances in optics and spectral sensor design, sensor fusion, and image processing techniques relevant to unique Navy airborne and maritime applications are described.


Proceedings of SPIE | 2014

Folded path LWIR system for SWAP constrained platforms

Erin Fleet; Michael L. Wilson; Dale C. Linne von Berg; Thomas Giallorenzi; Barry Mathieu

Folded path reflection and catadioptric optics are of growing interest, especially in the long wave infrared (LWIR), due to continuing demands for reductions in imaging system size, weight and power (SWAP). We present the optical design and laboratory data for a 50 mm focal length low f/# folded-path compact LWIR imaging system. The optical design uses 4 concentric aspheric mirrors, each of which is described by annular aspheric functions well suited to the folded path design space. The 4 mirrors are diamond turned onto two thin air-spaced aluminum plates which can be manually focused onto the uncooled LWIR microbolometer array detector. Stray light analysis will be presented to show how specialized internal baffling can be used to reduce stray light propagation through the folded path optical train. The system achieves near diffraction limited performance across the FOV with a 15 mm long optical train and a 5 mm back focal distance. The completed system is small enough to reside within a 3 inch diameter ball gimbal.
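
A quick back-of-the-envelope check of what "near diffraction limited" implies for a fast 50 mm LWIR system is sketched below; the f-number and the 17 µm microbolometer pixel pitch are assumptions, since only the focal length is stated above.

```python
# Compare the Airy-disk diameter (2.44 * lambda * f/#) with a typical uncooled
# microbolometer pixel pitch. Only the 50 mm focal length comes from the abstract;
# the f-number and pixel pitch are assumed for illustration.
WAVELENGTH_UM = 10.0      # mid LWIR band
FOCAL_LENGTH_MM = 50.0    # from the abstract
ASSUMED_F_NUMBER = 1.2    # "low f/#" -- exact value not given
PIXEL_PITCH_UM = 17.0     # common uncooled microbolometer pitch (assumed)

airy_diameter_um = 2.44 * WAVELENGTH_UM * ASSUMED_F_NUMBER
aperture_mm = FOCAL_LENGTH_MM / ASSUMED_F_NUMBER

print(f"entrance aperture  : {aperture_mm:.1f} mm")
print(f"Airy disk diameter : {airy_diameter_um:.1f} um vs pixel pitch {PIXEL_PITCH_UM:.1f} um")
print("detector-limited" if airy_diameter_um < PIXEL_PITCH_UM else "optics (diffraction) limited")
```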


Proceedings of SPIE | 2011

System considerations of aerial infrared imaging for wide-area persistent surveillance

Melvin R. Kruer; John N. Lee; Dale C. Linne von Berg; John G. Howard; Jason Edelberg

Wide field-of-view infrared sensor and data acquisition and exploitation systems are being developed and tested for detecting activity and threats over extended areas. Limitations on the total number of pixels available in infrared arrays drive sensor design trades between achieving the widest total field of view and a ground sample distance small enough to allow automated tracking and activity detection. In order to allow accurate imagery geolocation, the sensor's optical characteristics as well as its location and orientation must be accurately recorded with each image. This paper will discuss system considerations of infrared imaging sensors for wide-area persistent surveillance. We will present some uses of advanced day/night sensors for wide-area persistent surveillance that use large, high-quality mid-wave infrared (MWIR) staring arrays in a fast step-stare stabilized mount and a Windows-based data acquisition and exploitation system.
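
The field-of-view versus ground-sample-distance trade described above can be made concrete with the usual nadir pinhole relations; the sketch below uses assumed altitude, pixel pitch, and array width, not the parameters of the sensors in the paper.

```python
# Hedged sketch of the FOV vs. ground-sample-distance trade for a nadir-looking
# staring array. All parameter values are illustrative assumptions.
def nadir_gsd_m(pixel_pitch_um: float, altitude_m: float, focal_length_mm: float) -> float:
    """Ground sample distance directly below the aircraft (small-angle approximation)."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3)

def swath_m(n_pixels_across: int, gsd_m: float) -> float:
    """Ground coverage of one frame across track, ignoring projection distortion."""
    return n_pixels_across * gsd_m

altitude = 6_000.0          # m, assumed
pixel_pitch = 15.0          # um, assumed MWIR array pitch
array_width = 4_096         # pixels, assumed large-format array

for focal_length in (100.0, 250.0, 500.0):   # mm
    gsd = nadir_gsd_m(pixel_pitch, altitude, focal_length)
    print(f"f={focal_length:5.0f} mm  GSD={gsd:4.2f} m  single-frame swath={swath_m(array_width, gsd)/1000:4.1f} km")
```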


Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery | 2005

Extending application of spectral object signature transforms: background candidate assessment

Rulon Mayer; Frank Bucholtz; Eric Allman; Dale C. Linne von Berg; Mel Kruer

Previous studies introduced, examined, and tested a variety of registration-free transforms, specifically the diagonal, whitening/dewhitening, and target CV (covariance) transforms. These transforms temporally evolve spectral object signatures under varying conditions using imagery of regions with similar objects and content distributions from data sets collected at two different times. The transformed object signature is then inserted into a matched filter to search for targets. Spatially registering two areas and/or finding two suitable candidate regions for the transforms is often problematic. This study examines and finds that the average correlation coefficient between the corrected histograms of multispectral image cubes collected at two times can assess the similarity of the areas and predict object detection performance. The metric is applied in four distinctive situations and tested on three independently collected data sets. In one data set, the histograms were derived from an airborne long-wave infrared sensor that imaged objects in Florida, and the metric was tested on registered images modified by systematically eliminating opposed ends of the image set. Another data set examined images of objects in Yellowstone National Park from a visible/near-IR multispectral sensor. The comparison was also applied to images of objects placed at Webster Field in Maryland collected at oblique angles (depression angle of 10°). Candidate heterogeneous image areas were compared to each other using the average correlation coefficient and inserted into the statistical transforms. In addition, the correlations were computed between corrected histograms based on the normalized difference vegetation index (NDVI). Similarly, the analysis was applied to data collected at oblique angles (10° depression angle). The net signal-to-clutter ratio depends on the average correlation coefficient and has low p-values (p < 0.05). All statistical transforms (diagonal, whitening/dewhitening, target CV) performed comparably across the various backgrounds and scenarios. Objects that are spectrally distinct from the backgrounds followed the average correlation coefficient more closely than objects whose spectral signatures contained background components. This study is the first to examine the similarity of the corrected histograms and does not exclude other approaches for comparing areas.
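
A minimal sketch of the kind of band-averaged histogram-correlation metric described above follows; the binning and normalization choices are assumptions, and the paper's exact "corrected histogram" procedure is not reproduced.

```python
# Sketch of a band-averaged histogram-correlation metric: for each spectral band,
# correlate the normalized intensity histograms of two image cubes and average
# over bands. Binning and normalization here are illustrative assumptions.
import numpy as np

def average_histogram_correlation(cube_a: np.ndarray, cube_b: np.ndarray, bins: int = 64) -> float:
    """cube_a, cube_b: arrays of shape (rows, cols, bands) covering two candidate areas."""
    corrs = []
    for band in range(cube_a.shape[-1]):
        lo = min(cube_a[..., band].min(), cube_b[..., band].min())
        hi = max(cube_a[..., band].max(), cube_b[..., band].max())
        ha, _ = np.histogram(cube_a[..., band], bins=bins, range=(lo, hi), density=True)
        hb, _ = np.histogram(cube_b[..., band], bins=bins, range=(lo, hi), density=True)
        corrs.append(np.corrcoef(ha, hb)[0, 1])
    return float(np.mean(corrs))

# Example with synthetic cubes: similar band statistics give a correlation near 1.
rng = np.random.default_rng(1)
area1 = rng.normal(100.0, 10.0, size=(64, 64, 8))
area2 = rng.normal(102.0, 11.0, size=(64, 64, 8))
print(f"average band-histogram correlation: {average_histogram_correlation(area1, area2):.3f}")
```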

Collaboration


Top co-authors of Dale C. Linne von Berg and their affiliations:

Melvin R. Kruer, United States Naval Research Laboratory
John N. Lee, United States Naval Research Laboratory
M. D. Duncan, United States Naval Research Laboratory
Michael L. Wilson, United States Naval Research Laboratory
Alan Bird, Utah State University
John G. Howard, United States Naval Research Laboratory
Thomas J. Walls, United States Naval Research Laboratory