Publication


Featured research published by Peter Hecker.


International Conference on Pattern Recognition | 2000

Weather independent flight guidance: analysis of MMW radar images for approach and landing

Bernd Korn; Hans-Ullrich Doehler; Peter Hecker

We present a system that provides navigation information for approach and landing based on the analysis of millimeter wave (MMW) radar data. The advantage of MMW radar sensors is that data acquisition is independent of the actual weather and daylight situation. The core of the presented system is a fuzzy rule-based inference machine that controls the data analysis based on the uncertainty in the actual knowledge in combination with a priori knowledge. Compared with standard TV or IR images, the quality of MMW images is rather poor, and the data are highly corrupted by noise and clutter. Therefore, one main task of the inference machine is to handle uncertainties as well as ambiguities and inconsistencies in order to draw the right conclusions. The performance of our approach is demonstrated with real data acquired during extensive flight tests at several airports in Northern Germany.
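The fuzzy rule-based reasoning described in the abstract can be illustrated with a minimal sketch. The membership functions, rule set and feature names below are illustrative assumptions, not the authors' actual inference machine:

```python
# Minimal Mamdani-style fuzzy inference sketch (illustrative, not the
# authors' actual rule base): combine evidence about a candidate runway
# edge extracted from noisy MMW radar data.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function rising on [a, b], plateau [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def runway_confidence(edge_contrast, model_agreement):
    """Two fuzzy rules (min = AND, complement = NOT):
      R1: IF contrast is high AND model agreement is good THEN confidence high
      R2: IF contrast is low  OR  model agreement is poor THEN confidence low
    Defuzzify by a weighted average of the "high" (1.0) and "low" (0.0) outputs.
    """
    contrast_high = trapezoid(edge_contrast, 0.3, 0.6, 1.0, 1.1)
    agreement_good = trapezoid(model_agreement, 0.4, 0.7, 1.0, 1.1)
    r1 = min(contrast_high, agreement_good)          # fires the "high" output
    r2 = max(1 - contrast_high, 1 - agreement_good)  # fires the "low" output
    return r1 / (r1 + r2) if (r1 + r2) > 0 else 0.5

# Strong edge that matches the a priori runway model -> confidence 1.0
print(runway_confidence(0.9, 0.9))
# Weak, inconsistent evidence -> confidence 0.0
print(runway_confidence(0.2, 0.3))
```

The point of the fuzzy formulation is that noisy, partially contradictory evidence still yields a graded confidence rather than a brittle yes/no decision.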


Enhanced and Synthetic Vision 2000 | 2000

MMW radar based navigation: solutions of the vertical position problem

Bernd Korn; Hans-Ullrich Doehler; Peter Hecker

The acquisition of navigation data is an important upgrade of enhanced vision (EV) systems. For example, the position of an aircraft relative to the runway during landing approaches has to be derived directly from the data of the EV sensors if no ILS or GPS navigation information is available. Due to its weather independence, MMW radar plays an important role among possible EV sensors. Generally, information about the altitude of the aircraft relative to a target ahead (the runway) is not available within radar data. A common approach to overcome this so-called vertical position problem is the flat earth assumption, i.e. the altitude above the runway is assumed to be the same as the actual altitude of the aircraft measured by the radar altimeter. Another approach known from the literature is to combine radar images from different positions, similar to stereo and structure-from-motion approaches in computer vision. In this paper we present a detailed investigation of the latter approach. We examine the correspondence problem with regard to the special geometry of radar sensors as well as the principal methodology for estimating 3D information from different range-angle measurements. The main part of the contribution deals with the question of accuracy: What accuracy can be obtained? What are the influences of factors like vertical beam width, range and angular resolution of the sensor, the relative transformation between different sensor locations, etc.? Based on this investigation, we introduce a new approach for vertical positioning. As a special benefit, this method delivers a measure of validity which allows judging the estimate of the relative vertical position from sensor to target. The performance of our approach is demonstrated with both simulated data and real data acquired during flight tests.
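The structure-from-motion idea behind this paper can be reduced to a simple along-track geometry sketch. The sketch assumes level flight straight toward the target and an exactly known baseline between the two measurement positions, both idealizations (the paper's accuracy analysis is precisely about how errors in such quantities propagate):

```python
import math

def height_from_two_ranges(r1, r2, baseline):
    """Estimate height above a ground target from two slant ranges.

    Idealized sketch: the aircraft flies level, straight toward the
    target, and advances by `baseline` between the two measurements.
    With horizontal distance d and relative height h at the first
    position:  r1^2 = d^2 + h^2  and  r2^2 = (d - baseline)^2 + h^2.
    Subtracting the two equations gives d in closed form, then h.
    """
    d = (r1**2 - r2**2 + baseline**2) / (2 * baseline)
    return math.sqrt(r1**2 - d**2)

# Target 1000 m ahead and 100 m below, measured again after 50 m of travel.
r1 = math.hypot(1000, 100)
r2 = math.hypot(950, 100)
print(height_from_two_ranges(r1, r2, 50))  # ≈ 100.0
```

With noisy ranges the subtraction amplifies the error, which is why a validity measure for the resulting height estimate, as proposed in the paper, matters in practice.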


Enhanced and Synthetic Vision 1999 | 1999

MMW radar data processing for enhanced vision

Bernd Korn; Hans-Ullrich Doehler; Peter Hecker

Comprehensive situation awareness is very important for aircrews to handle complex situations like landing approaches or taxiing, especially under adverse weather conditions. Thus, DLR's Institute of Flight Guidance is developing an enhanced vision system that uses different forward-looking imaging sensors to gain the information needed for executing given tasks. Furthermore, terrain models, if available, can be used to control as well as to support the sensor data processing. Up to now, the most promising sensor, due to its lowest weather dependency compared to other imaging sensors, seems to be a 35 GHz MMW radar from DASA, Ulm, which provides range data with a frame rate of about 16 Hz. In previous contributions, first experimental results of our radar data processing have been presented. In this paper we deal with the radar data processing in more detail, focusing on automatic extraction of features relevant for landing approaches and taxiing maneuvers. In the first part of this contribution we describe a calibration of the MMW radar, which is necessary to determine the exact relationship between raw sensor data (pixels) and world coordinates. Furthermore, the calibration gives us an idea of how accurately features can be located in the world. The second part of this paper covers our approach for automatically extracting features relevant for landing and taxiing. Improvements in spatial resolution as well as noise reduction are achieved with a multi-frame approach. The correspondence of features in different frames is found with the aid of navigation sensors like INS or GPS, but can also be established by tracking methods. To demonstrate the performance of our approach we applied the extraction method to simulated data as well as to real data. The real data have been acquired using a test van and a test aircraft, both equipped with a prototype of the imaging MMW radar from DASA, Ulm.
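The noise-reduction benefit of the multi-frame approach can be sketched in a few lines. Registration is assumed to be already done (e.g. via INS/GPS, as in the paper), and the scene and noise are synthetic, so this only illustrates why averaging registered frames suppresses noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "scene" (stand-in for a registered MMW radar image)
scene = np.zeros((64, 64))
scene[30:34, 10:54] = 1.0  # a bright runway-like stripe

# N registered frames, each corrupted by independent zero-mean noise
n_frames, sigma = 16, 0.5
frames = [scene + rng.normal(0.0, sigma, scene.shape) for _ in range(n_frames)]

# Averaging N registered frames reduces the noise standard deviation
# by roughly a factor of sqrt(N) while the static scene is preserved.
averaged = np.mean(frames, axis=0)

single_noise = np.std(frames[0] - scene)
multi_noise = np.std(averaged - scene)
print(single_noise, multi_noise)  # multi_noise ≈ single_noise / 4
```

The sqrt(N) gain only holds if the frames are correctly registered, which is why the correspondence step (navigation sensors or tracking) is central to the paper's approach.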


Enhanced and Synthetic Vision 1999 | 1999

Enhanced vision meets pilot assistance

Peter Hecker; Hans-Ullrich Doehler; Reiner Suikat

As presented in previous contributions to SPIE's Enhanced and Synthetic Vision conferences, DLR's Institute of Flight Guidance is involved in the design, development and testing of enhanced vision systems for flight guidance applications. The combination of forward-looking imaging sensors (such as DaimlerChrysler's HiVision millimeter wave radar), terrain data stored in on-board databases, plus information transmitted from the ground or from other aircraft via data link, is used to give the aircrew improved situational awareness. This helps pilots handle critical tasks such as landing approaches and taxiing, especially under adverse weather conditions. The research and development of this system was mostly funded by a national research program from mid-1996 to mid-1999. On the one hand, this paper gives a general overview of the project and the lessons learned. Results of flight tests carried out recently are shown, as well as brief looks into evaluation tests performed in mid-1998 on board an Airbus A340 full-flight simulator at the Flight Simulation Center Berlin. On the other hand, an outlook is presented which shows enhanced vision systems as a major player in the field of pilot assistance systems as they are under development at DLR's Institute of Flight Guidance in close cooperation with the University of the Federal Armed Forces in Munich, Germany.


Enhanced and Synthetic Vision Conference | 2002

Generic experimental cockpit for evaluating pilot assistance systems

Helmut H. Toebben; Hans-Ullrich Doehler; Peter Hecker

The workload of aircraft crews, especially during taxiing, take-off, approach and landing under adverse weather conditions, has increased heavily due to the continuous growth of air traffic. New pilot assistance systems can improve the situational awareness of the aircrew and consequently increase safety and reduce workload. For demonstration and human-factors evaluation of such new systems, DLR has built a Generic Experimental Cockpit simulator equipped with a modern glass-cockpit collimated display. The Primary Flight Display (PFD), the human-machine interface for an Advanced Flight Management System (AFMS), a taxi guidance system called Taxi and Ramp Management and Control (TARMAC), and an Enhanced Vision System (EVS) based on real-time simulation of MMWR and FLIR sensors are integrated into the cockpit on high-resolution TFT touch screens. Situational awareness is further enhanced by the integration of a raster/stroke-capable Head-Up Display (HUD), which spares the pilot's eyes the permanent re-accommodation between the head-down displays and the outside view. This contribution describes the technical implementation of the PFD, the taxi guidance system and the EVS on the HUD. The HUD is driven by a normal PC, which provides the ARINC data for the stroke generator and the video signal for the raster image. The PFD uses the built-in stroke generator and is active during all operations. During taxi operations, the cleared taxi route and the positions of other aircraft are displayed via raster. The images of the real-time simulation of the MMWR and FLIR sensors are presented via raster on demand. During approach and landing, a runway symbol or a 3D wireframe database is shown which exactly matches the outside view, and obstacles on the runway are highlighted. The runway position is automatically calculated from the MMWR sensor, as reported in previous contributions.


Proceedings of SPIE | 2001

Navigation integrity monitoring and obstacle detection for enhanced-vision systems

Bernd Korn; Hans-Ullrich Doehler; Peter Hecker


Proceedings of SPIE | 1998

Enhanced vision systems: results of simulation and operational tests

Peter Hecker; Hans-Ullrich Doehler


Proceedings of SPIE | 2001

Extending enhanced-vision capabilities by integration of advanced surface movement guidance and control systems (A-SMGCS)

Peter Hecker; Hans-Ullrich Doehler; Bernd Korn; Thomas Ludwig


AIAA International Air and Space Symposium and Exposition: The Next 100 Years | 2003

Increasing the Aircrew's Situation Awareness by Enhanced and Synthetic Vision

Peter Hecker; Bernd Korn


Archive | 2005

Passive “Radar-PAPI” landing aids for precision straight-in approach and landing in low visibility

Bernd Korn; Bernd Lorenz; Hans-Ullrich Doehler; Helmut Toebben; Peter Hecker

Collaboration


Dive into Peter Hecker's collaborations.

Top Co-Authors

Bernd Korn

German Aerospace Center

Bernd Lorenz

German Aerospace Center
