
Publications


Featured research published by Thomas Lueken.


Proceedings of SPIE | 2009

ALLFlight - A full scale enhanced and synthetic vision sensor suite for helicopter applications

Hans-Ullrich Doehler; Thomas Lueken; R. Lantzsch

In 2008, the German Aerospace Center (DLR) started the project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites). The project deals with the integration of a full-scale enhanced vision sensor suite onto DLR's research helicopter EC135. This sensor suite consists of a variety of imaging sensors, including a color TV camera and an uncooled thermal infrared camera. Two ranging sensors are also part of the suite: an optical radar scanner and a millimeter wave radar system. Both ranging sensors are equipped with specialized software for experimental modes, such as terrain mapping and ground scanning. To process and display the huge incoming flood of data from these sensors, a compact high-performance sensor co-computer system (SCC) has been designed and realized, which can easily be installed in the helicopter's cargo bay. A sophisticated, high-performance, distributed software architecture for data acquisition, recording, processing, and fusion was developed and implemented during the first project year. The paper describes the challenging mechanical integration of such a comprehensive sensor suite onto the EC135 and explains the hardware and software architecture and its implementation on the SCC.


IEEE/AIAA Digital Avionics Systems Conference | 2008

Lidar simulation using graphics hardware acceleration

Niklas Peinecke; Thomas Lueken; Bernd Korn

Modern enhanced and synthetic vision systems (EVS/SVS) often make use of the fusion of multi-sensor data. There is thus a demand for simulated sensor data in order to test and evaluate those systems at an early stage. We describe an approach for simulating Lidar sensors on modern computer graphics hardware, making heavy use of recent technologies like vertex and fragment shaders. This approach has previously been used successfully for simulating millimeter wave radar sensors. We show that a multi-sensor simulation suite integrating sensors as different as millimeter wave radar, Lidar, or infrared can be realized using principally similar software techniques, allowing for a unified, comprehensive simulator. This approach lets us use a single consistent database for multi-sensor fusion. Recent graphics hardware offers the possibility of carrying out a variety of tasks on the graphics processing unit (GPU), as opposed to the traditional approach of doing most computations on the CPU. Vertex and fragment shaders make these tasks particularly easy. We present a vertex shader solution written in GLSL, the OpenGL shading language. The program computes all view transformations and shading information necessary for Lidar simulation in one pass, allowing high frame rates for real-time simulation of even complex flight scenes.
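The geometric core of this technique, converting a perspective depth rendering into per-beam Lidar ranges, can be illustrated in a few lines. The paper implements it in a GLSL shader on the GPU; the NumPy stand-in below is only a sketch of the same geometry (the function name and the flat-wall scene are illustrative, not from the paper): each pixel of a rendered depth image is treated as one simulated Lidar beam, and depth along the view axis is rescaled to range along that beam.

```python
import numpy as np

def depth_to_range(depth, fov_x_deg, fov_y_deg):
    """Convert a perspective depth image (distance along the view axis)
    to per-pixel ranges (distance along each simulated Lidar beam).
    Illustrative sketch, not the paper's GLSL implementation."""
    h, w = depth.shape
    # Perspective images are uniform in tangent space across the FOV.
    half_tx = np.tan(np.deg2rad(fov_x_deg / 2))
    half_ty = np.tan(np.deg2rad(fov_y_deg / 2))
    tan_x = np.linspace(-half_tx, half_tx, w)[None, :]
    tan_y = np.linspace(-half_ty, half_ty, h)[:, None]
    # Ray direction (tan_x, tan_y, 1) has length sqrt(1 + tx^2 + ty^2);
    # scaling the axial depth by that length gives the beam range.
    return depth * np.sqrt(1.0 + tan_x**2 + tan_y**2)

# A 3x3 depth image of a flat wall 100 m ahead of the sensor:
depth = np.full((3, 3), 100.0)
ranges = depth_to_range(depth, fov_x_deg=40.0, fov_y_deg=30.0)
```

The center beam is perpendicular to the wall, so its range equals the axial depth; off-axis beams travel farther, which is exactly the correction a depth-buffer-based Lidar simulator must apply per pixel.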


Proceedings of SPIE | 2014

Visual-conformal display format for helicopter guidance

Hans-Ullrich Doehler; Sven Schmerwitz; Thomas Lueken

Helicopter guidance in situations where natural vision is reduced is still a challenging task. Besides newly available sensors, which are able to “see” through darkness, fog, and dust, display technology remains one of the key issues of pilot assistance systems. As long as we have pilots within aircraft cockpits, we have to keep them informed about the outside situation. “Situational awareness” of humans is mainly powered by their visual channel. Therefore, display systems which are able to cross-fade seamlessly from natural vision to artificial computer vision and vice versa are of greatest interest in this context. Helmet-mounted displays (HMD) have this property when they apply a head tracker for measuring the pilot’s head orientation relative to the aircraft reference frame. Together with the aircraft’s position and orientation relative to the world reference frame, the on-board graphics computer can generate images which are perfectly aligned with the outside world. We call image elements which match the outside world “visual-conformal”. Published display formats for helicopter guidance in degraded visual environments mostly apply 2D symbologies, which fall far short of what is possible. We propose a perspective 3D symbology for a head-tracked HMD which shows as many visual-conformal elements as possible. We implemented and tested our proposal in our fixed-base cockpit simulator as well as in our flying helicopter simulator (FHS). Recently conducted simulation trials with experienced helicopter pilots provide first evaluation results for our proposal.


Optical Engineering | 2017

Conformal Displays: Human Factors Analysis of Innovative Landing Aids

Sven Schmerwitz; Thomas Lueken; Hans-Ullrich Doehler; Niklas Peinecke; Johannes M. Ernst; David L. da Silva Rosa

In the past couple of years, research on display content for helicopter operations has headed in a new direction. The goals already reached could evolve into a paradigm change for information visualization. Technology advancements allow implementing three-dimensional and conformal content on a helmet-mounted see-through device. This superimposed imagery inherits the same optical flow as the environment and is supposed to ease switching between display information and environmental cues. The concept is neither pathbreaking nor new, but it has not yet been successfully established in aviation. Nevertheless, there are certainly some advantages to expect, at least from the perspective of a human-centered system design. In the following pages, the next generation of displays will be presented and discussed with a focus on human factors. Beginning with a recollection of some human-factors research facts, an experiment comparing the former two-dimensional research displays will be presented. Before introducing the DLR conformal symbol set and the three experiments about an innovative drift indication, related research activities toward conformal symbol sets will be addressed.


Proceedings of SPIE | 2016

Helmet mounted display supporting helicopter missions during en route flight and landing

Thomas Lueken; Hans-Ullrich Doehler; Sven Schmerwitz

Degraded visual environment is still a major problem for helicopter pilots, especially during approach and landing. Particularly in the landing phase, the pilot’s eyes must be directed outward in order to find visual cues as indicators for drift estimation. If lateral speed exceeds the limits, it can damage the airframe or, in extreme cases, lead to a rollover. Since poor visibility can contribute to a loss of situation awareness and spatial disorientation, it is crucial to intuitively provide the pilot with the essential visual information needed for a safe landing. With continuous technological advancement, helmet-mounted displays (HMD) will soon become widespread, because their see-through capability enables monitoring of the outside view while flight-phase-dependent symbology is presented on the helmet display. Besides primary flight information, additional information for obstacle accentuation or terrain visualization can be displayed on the visor. Virtual conformal elements like a 3D pathway depiction or a 3D landing zone representation can help the pilot maintain control until touchdown even in poor visual conditions. This paper describes first investigations of both en-route and landing symbology presented on a helmet-mounted display system in the scope of helicopter flight trials with DLR’s flying helicopter simulator ACT/FHS.


Proceedings of SPIE | 2017

Designing a virtual cockpit for helicopter offshore operations

Johannes M. Ernst; Sven Schmerwitz; Thomas Lueken; Lars Ebrecht

In recent years, the number of offshore wind farms has been increasing rapidly. Coastal European countries in particular are building numerous offshore wind turbines in the Baltic, the North, and the Irish Sea. During both construction and operation of these wind farms, many specially equipped helicopters are on duty. Due to their flexibility, their hover capability, and their higher speed compared to ships, these aircraft perform important tasks like helicopter emergency medical services (HEMS) as well as passenger and freight transfer flights. The missions often include specific challenges like platform landings or hoist operations to drop off workers onto wind turbines. However, adverse weather conditions frequently limit helicopter offshore operations. In such scenarios, the application of aircraft-mounted sensors and obstacle databases together with helmet-mounted displays (HMD) seems to offer great potential to improve the operational capabilities of the helicopters used. By displaying environmental information in a visual-conformal manner, these systems mitigate the loss of visual reference to the surroundings and help the pilots maintain proper situational awareness. This paper analyzes the specific challenges of helicopter offshore operations in wind farms by means of an online survey and a structured interview with pilots and operators. Further, the work presents how our previously introduced concept of an HMD-based virtual flight deck could enhance helicopter offshore missions. The advantages of this system, for instance its “see-through the airframe” capability and its highly flexible cockpit setup, enable us to design entirely novel pilot assistance systems. The gained knowledge will be used to develop a virtual cockpit that is tailor-made for helicopter offshore maneuvers.


Proceedings of SPIE | 2016

Amplifying the helicopter drift in a conformal HMD

Sven Schmerwitz; Patrizia Knabl; Thomas Lueken; Hans-Ullrich Doehler

Helicopter operations require well-controlled, minimal lateral drift shortly before ground contact. Any lateral speed exceeding this small threshold can cause a dangerous moment around the roll axis, which may lead to a complete rollover of the helicopter. As long as pilots can observe visual cues from the ground, they are able to control the helicopter drift easily. But whenever natural vision is reduced or obscured, e.g. due to night, fog, or dust, this controllability diminishes. Helicopter operators could therefore benefit from some type of “drift indication” that mitigates the influence of a degraded visual environment. Generally, humans derive ego motion from the perceived flow of environmental objects. The visual cues perceived are located close to the helicopter; therefore even small movements can be recognized. This fact was used to investigate a modified drift indication. To enhance the perception of ego motion in a conformal HMD symbol set, the measured movement was used to generate a pattern motion in the forward field of view, close to or on the landing pad. The paper discusses the method of amplified ego-motion drift indication. Aspects concerning impact factors like visualization type, location, gain, and more will be addressed. Further, conclusions from previous studies, a high-fidelity experiment and a part-task experiment, will be provided. A part-task study will be presented that compared different amplified drift indications against a predictor. Twenty-four participants, 15 holding a fixed-wing license and 4 of them helicopter pilots, had to perform a dual task on a virtual reality headset. A simplified control model was used to steer a “helicopter” down to a landing pad while acknowledging randomly placed characters.
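The amplification idea described above reduces to one relation: the on-display pattern moves at the measured drift velocity multiplied by a gain, so drift rates that sit below the pilot’s perception threshold become visually salient. A minimal sketch, assuming an illustrative gain and update rate rather than the study’s actual parameters:

```python
def pattern_offset(drift_mps, gain, dt, offset=0.0):
    """Advance the on-display pattern position by the amplified
    drift per frame. Gain and rates are illustrative assumptions."""
    return offset + gain * drift_mps * dt

# 0.2 m/s lateral drift, amplified 5x, over one second at 50 Hz:
offset = 0.0
for _ in range(50):
    offset = pattern_offset(0.2, gain=5.0, dt=0.02, offset=offset)
# The pattern has moved 1.0 m in display space for 0.2 m of real drift.
```

With a gain of 1 the pattern would move conformally with the world; gains above 1 trade conformality for detectability of small drift rates, which is the design tension the experiments examine.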


Proceedings of SPIE | 2013

ALLFlight: detection of moving objects in IR and ladar images

Hans-Ullrich Doehler; Niklas Peinecke; Thomas Lueken; Sven Schmerwitz

Supporting a helicopter pilot during landing and takeoff in a degraded visual environment (DVE) is one of the challenges within DLR’s project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites). Different types of sensors (TV, infrared, mmW radar, and laser radar) are mounted onto DLR’s research helicopter FHS (flying helicopter simulator) to gather different sensor data of the surrounding world. A high-performance computer cluster architecture acquires and fuses all the information into one single comprehensive description of the outside situation. While both TV and IR cameras deliver images at frame rates of 25 Hz or 30 Hz, Ladar and mmW radar provide georeferenced sensor data at only 2 Hz or even less. Therefore, it takes several seconds to detect or even track potential moving obstacle candidates in mmW or Ladar sequences. Especially if the helicopter is flying at higher speed, it is very important to minimize the detection time of obstacles in order to initiate a re-planning of the helicopter’s mission in a timely manner. Applying feature extraction algorithms to IR images, in combination with fusion of the extracted features and Ladar data, can decrease the detection time appreciably. Based on real data from flight tests, the paper describes the applied feature extraction methods for moving object detection, as well as data fusion techniques for combining features from TV/IR and Ladar data.
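The multi-rate fusion problem sketched above, 25–30 Hz camera frames against Ladar scans at 2 Hz or less, requires associating each slow scan with the camera frame nearest in time before features from the two streams can be combined. A minimal sketch of such timestamp matching (the rates and the function name are illustrative assumptions, not DLR’s implementation):

```python
import bisect

def match_nearest(frame_times, scan_time):
    """Return the index of the camera frame whose timestamp is closest
    to the given Ladar scan timestamp (frame_times must be sorted)."""
    i = bisect.bisect_left(frame_times, scan_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
    return min(candidates, key=lambda j: abs(frame_times[j] - scan_time))

# IR frames at 25 Hz, Ladar scans at 2 Hz (timestamps in seconds):
frames = [k * 0.04 for k in range(100)]   # 0.00, 0.04, ..., 3.96
scans = [0.5 * k for k in range(8)]       # 0.0, 0.5, ..., 3.5
pairs = [(t, frames[match_nearest(frames, t)]) for t in scans]
# Each 2 Hz scan is paired with a 25 Hz frame at most 20 ms away.
```

In a real system the association would additionally have to compensate for sensor latencies and for the helicopter’s ego motion between the two timestamps.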


Proceedings of SPIE | 2010

ALLFlight: multisensor data fusion for helicopter operations

Hans-Ullrich Doehler; Thomas Lueken

The objective of the project ALLFlight (Assisted Low Level Flight and Landing on Unprepared Landing Sites) is to demonstrate and evaluate the characteristics of different sensors for helicopter operations within degraded visual environments, such as brownout or whiteout. The sensor suite, which is mounted onto DLR’s research helicopter EC135, consists of standard color or black-and-white TV cameras, an uncooled thermal infrared camera (EVS-1000, Max-Viz, USA), an optical radar scanner (HELLAS-W, EADS, Germany), and a millimeter wave radar system (AI-130, ICx Radar Systems, Canada). Data processing is designed and realized by a sophisticated, high-performance sensor co-computer (SCC) cluster architecture, which is installed in the helicopter’s experimental electronic cargo bay. This paper describes the applied methods and the software architecture in terms of real-time data acquisition, recording, time stamping, and sensor data fusion. First concepts for a pilot HMI are presented as well.


CEAS Aeronautical Journal | 2012

ALLFlight: tackling the brownout problem

Thomas Lueken; Niklas Peinecke; Hans-Ullrich Doehler; Robin Lantzsch

Collaboration


Dive into Thomas Lueken's collaborations.

Top Co-Authors

Bernd Korn

German Aerospace Center


Lars Ebrecht

German Aerospace Center
