Valerij Tchernykh
Dresden University of Technology
Publications
Featured research published by Valerij Tchernykh.
international conference on advanced intelligent mechatronics | 2005
Klaus Janschek; Valerij Tchernykh; Serguei Dyblenko
This paper presents the concept of a smart satellite pushbroom imaging system with internal compensation of attitude instability effects. The compensation is performed within the optical path by an active opto-mechatronic stabilization of the focal plane image motion in a closed loop system with visual feedback. Residual distortions are corrected by image deblurring through deconvolution. Both corrective actions are derived from a real-time image motion measurement which is based on an auxiliary matrix image sensor and an onboard optical correlator. The paper describes the principles of operation and the main system elements, and gives detailed performance figures derived from a simulation performance model which contains all relevant components of the smart imaging system.
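The deblurring step mentioned above corrects residual distortions by deconvolution with the measured motion blur kernel. The abstract does not name the filter used; a Wiener filter is one standard choice, sketched below in Python/NumPy as an illustration (function name, `snr` parameter and kernel are assumptions, not the authors' implementation):

```python
import numpy as np

def wiener_deblur(blurred, psf, snr=100.0):
    """Deconvolve an image blurred by a known point-spread function
    (e.g. a residual image-motion blur kernel) with a Wiener filter."""
    H = np.fft.fft2(psf, s=blurred.shape)   # transfer function of the blur
    G = np.fft.fft2(blurred)
    # Wiener filter: H* G / (|H|^2 + 1/SNR); the SNR term regularizes
    # frequencies where the blur nearly cancels the signal.
    F = np.conj(H) * G / (np.abs(H) ** 2 + 1.0 / snr)
    return np.fft.ifft2(F).real
```

With an accurate kernel and high SNR this recovers the sharp image almost exactly; in practice the SNR term must match the sensor noise level.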
Remote Sensing | 2004
Valerij Tchernykh; Serguei Dyblenko; Klaus Janschek; Klaus Seifart; Bernd Harnisch
The smart pushbroom imaging system (SMARTSCAN) solves the problem of image correction for satellite pushbroom cameras that are disturbed by satellite attitude instability effects. Satellite cameras with linear sensors are particularly sensitive to attitude errors, which cause considerable image distortions. A novel solution for distortion correction is presented, based on real-time recording of the image motion in the focal plane of the satellite camera. This allows such smart pushbroom cameras (multi-/hyperspectral) to be used even on moderately stabilised satellites, e.g. small sats or LEO comsats. The SMARTSCAN concept uses in-situ measurements of the image motion with additional CCD sensors in the focal plane and real-time image processing of these measurements by an onboard Joint Transform Optical Correlator. SMARTSCAN has been successfully demonstrated with breadboard models of the Optical Correlator and a Smart Pushbroom Camera at laboratory level (on a satellite motion simulator based on a 5 DOF industrial robot) and in an airborne flight demonstration in July 2002. The paper briefly describes the principle of operation of the system and the hardware models. Detailed results of the airborne tests and a performance analysis are given, together with a detailed test description.
intelligent robots and systems | 2006
Valerij Tchernykh; Martin Beck; Klaus Janschek
The conceptual design of an embedded high performance opto-electronic optical flow processor is presented, designed for navigation applications in the fields of robotics (ground, aerial, marine) and space (satellites, landing vehicles). It is based on 2D fragment image motion determination by 2D correlation. To meet the real-time performance requirements, the principle of joint transform correlation (JTC) and advanced optical correlator technology are used. The paper briefly recalls the underlying principles of optical flow computation and optical correlation, shows the system layout and the conceptual design of the optical flow processor, and gives preliminary performance results based on a high fidelity simulation of the complete optical processing chain.
Algorithms and Systems for Optical Information Processing IV | 2000
Valerij Tchernykh; Sergey V. Dyblenko; Klaus Janschek; Bernd Harnisch
The paper describes the design concept of an optoelectronic system for real-time image motion analysis. The system is proposed to be used onboard an Earth observation satellite for real-time recording of the image motion in the focal plane of the camera. With this record available, it is possible to use pushbroom scan cameras onboard satellites with moderate attitude stability (geometric distortions caused by the attitude instability can be corrected a posteriori on the basis of the records). New experimental results are presented, which have been derived from a real-time breadboard model of the optical processor, developed and manufactured under an ESA contract. The results of the tests are provided as well as the expected performance of a full-scale system.
Remote Sensing | 2004
Klaus Janschek; Valerij Tchernykh; Serguei Dyblenko; Grégory Flandin; Bernd Harnisch
The paper presents a concept of a smart pushbroom imaging system with compensation of attitude instability effects. The compensation is performed by active opto-mechatronic stabilization of the focal plane image motion in a closed loop system with visual feedback, based on an auxiliary matrix image sensor and an onboard optical correlator. In this way the effects of attitude instability, vibrations and micro shocks can be neutralized, the image quality improved and the requirements on satellite attitude stability reduced. To prove the feasibility and to estimate the effectiveness of the image motion stabilization, a performance model of the smart imaging system has been developed and a simulation experiment has been carried out. The description of the performance model and the results of the simulation experiment are also given.
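The closed-loop idea — measure the residual focal-plane image shift and drive a compensating actuator from that visual feedback — can be illustrated with a toy discrete-time simulation. The PI control law and the gains `kp`, `ki` below are illustrative assumptions; the actual opto-mechatronic control design is not given in the abstract:

```python
import numpy as np

def simulate_stabilization(disturbance, kp=0.8, ki=0.4):
    """Toy closed loop: each step, the measured residual image shift
    (visual feedback) updates a PI controller commanding the
    compensating actuator in the optical path."""
    actuator = 0.0    # compensation currently applied
    integral = 0.0    # accumulated residual (integral term)
    residuals = []
    for d in disturbance:
        residual = d - actuator        # image shift left uncompensated
        integral += residual
        actuator += kp * residual + ki * integral
        residuals.append(residual)
    return np.array(residuals)
```

For a constant attitude offset, the residual image shift decays to essentially zero within a few iterations, which is the effect the abstract describes as neutralizing attitude instability in the focal plane.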
International Symposium on Optical Science and Technology | 2002
Valerij Tchernykh; Sergey V. Dyblenko; Klaus Janschek; Wolfgang Goehler; Bernd Harnisch
The paper describes the test results of the hardware model of a smart pushbroom imaging system. The imaging system can be used on satellites with moderate attitude stability thanks to image correction based on a real-time image motion record produced by an optoelectronic image processor and auxiliary sensors in the focal plane. The tested model includes the breadboard model of a smart pushbroom camera with auxiliary sensors, the optoelectronic processor model and the image correction software. The tests have been performed on a laboratory satellite motion simulator based on a 5 DOF industrial robot. Numerical values of the image motion record accuracy and the image correction efficiency are given, as well as a detailed test description.
At-automatisierungstechnik | 2005
Klaus Janschek; Valerij Tchernykh; Serguei Dyblenko; Grégory Flandin
Abstract In this article a new concept of a smart satellite pushbroom imaging system with internal compensation of attitude instability effects is presented. The compensation is performed within the optical path by an active opto-mechatronic stabilization of the focal plane image motion in a closed loop system with visual feedback. The real-time image motion measurement is derived from an auxiliary matrix image sensor and an onboard optical correlator. In this way the effects of attitude instability, vibrations and micro shocks can be neutralized, the image quality is improved and the requirements on satellite attitude stability can be reduced considerably. The paper describes the principles of operation and the main system elements, and gives detailed performance figures derived from a simulation performance model which contains all relevant components of the smart imaging system.
Optical technologies for communications. Conference | 2004
Sergey V. Dyblenko; Klaus Janschek; Anton E. Kisselev; Albert H. Sultanov; Valerij Tchernykh
An information system that determines satellite navigation parameters on the basis of landmark image processing is considered. The concept of optoelectronic navigation is based on applying an onboard optical correlator to real-time matching of Earth images with prerecorded images of landmarks with known coordinates. The system is suitable for low-orbit Earth imaging missions with 3-axis attitude stabilization and can be used for backup landmark navigation, precision pointing of the imaging payload and onboard georeferencing of the obtained images. A mathematical model of the optoelectronic landmark navigation is considered. A compact optics design as well as software and hardware models of the joint transform optical correlator have been developed. Experimental results obtained with the image processing system are presented. The effects of current image distortions on correlator performance were investigated. In a series of simulated experiments, the image matching accuracy was estimated in the presence of image distortions and noise typical of a high-resolution Earth observation mission. The possibility of obtaining sub-pixel image matching accuracy in real conditions under a noisy environment is shown.
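The sub-pixel matching accuracy mentioned above requires refining the correlation peak beyond integer pixel positions. The abstract does not state how this is done; a common technique is a three-point parabolic fit around the peak of the correlation surface, sketched here as an assumption-laden illustration in Python/NumPy:

```python
import numpy as np

def subpixel_peak(corr):
    """Refine the integer maximum of a correlation surface to sub-pixel
    accuracy with a three-point parabolic fit along each axis
    (assumes the peak does not lie on the border of the surface)."""
    r, c = np.unravel_index(np.argmax(corr), corr.shape)
    # Vertex of the parabola through (peak-1, peak, peak+1) samples
    dr = (corr[r - 1, c] - corr[r + 1, c]) / (
        2.0 * (corr[r + 1, c] + corr[r - 1, c] - 2.0 * corr[r, c]))
    dc = (corr[r, c - 1] - corr[r, c + 1]) / (
        2.0 * (corr[r, c + 1] + corr[r, c - 1] - 2.0 * corr[r, c]))
    return r + dr, c + dc
```

On a smooth, well-sampled correlation peak this typically recovers the true maximum to a small fraction of a pixel, consistent with the sub-pixel accuracy claimed in the abstract.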
international conference on intelligent transportation systems | 2003
Reinhart Kuhne; Carsten Dalaff; Martin Ruhé; Thomas Rupp; Ludger Froebel; Klaus Janschek; Valerij Tchernykh; Peter Behr
MultiSat WebService is a spaceborne service concept primarily intended to support, extend or substitute information services for mobility and traffic purposes. It allows the determination of traffic data from space on a global and near-real-time scale. The main objective is to provide a profitable service for mobility and traffic management. A market survey shows that spaceborne online information services may be viable. The service provides the possibility to receive pre-processed, near-real-time Earth surface data with e-commerce compatible methods. The system design gives the opportunity to freely configure the space system according to customer needs. The MultiSat infrastructure design features a satellite constellation with imaging synthetic aperture radar (SAR) and optical payloads, combined with low-rate communication established especially to support this service. Also included is a scalable, fault-tolerant, multi-computer system. The development cycle focuses on an airborne demonstration of the service idea as a first milestone. The MultiSat WebService concept is being created and designed by a consortium consisting of the German Aerospace Center (DLR), Technische Universität Dresden and Fraunhofer Gesellschaft FIRST, and is presented here as a visionary feasibility study.
Archive | 2007
Valerij Tchernykh; Martin Beck; Klaus Janschek
Autonomous visual navigation, i.e. determination of position, attitude and velocity (ego motion) by processing the images from onboard camera(s), is essential for mobile robot control even in the presence of GPS networks, as the accuracy of GPS data and/or the available map of the surroundings can be insufficient. Besides, GPS signal reception can be unstable in many locations (inside buildings, in tunnels, narrow streets, canyons, under trees, etc.). Up to now most practical visual navigation solutions have been developed for ground robots moving in cooperative and/or well determined environments. However, future generations of mobile robots should also be capable of operating in complex and noncooperative 3D environments. Visual navigation in such conditions is much more challenging, especially for flying robots, where the full 6DOF pose/motion should be determined. Generally, 3D environment perception is required in this case, i.e., determination of a local depth map for the visible scene. 3D scene information can be obtained by stereo imaging; however, this solution has certain limitations. It requires at least two cameras, precisely mounted with a certain stereo base (which can be critical for small vehicles). Due to the fixed stereo base, the range of depth determination with stereo imaging is limited. A more universal solution with fewer hardware requirements can be achieved with optical flow processing of sequential images from a single onboard camera. The ego motion of a camera rigidly mounted on a vehicle is mapped into the motion of image pixels in the camera focal plane. This image motion is commonly understood as image flow or optical flow (OF) (Horn & Schunck, 1981). This vector field of 2D image motion can be used efficiently for 3D environment perception (mapping) and vehicle pose/motion determination, as well as for obstacle avoidance or visual servoing.
The big challenge for using optical flow in real applications is its computability in terms of density (sparse vs. dense optical flow), accuracy, robustness to dark and noisy images, and real-time determination. The general problem of optical flow determination can be formulated as the extraction of the two-dimensional projection of the 3D relative motion onto the image plane, in the form of a field of correspondences (motion vectors) between points in consecutive image frames.
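The correspondence field described above can be sketched digitally: each image fragment's motion vector is found as the peak of a cross-correlation between consecutive frames. The Python/NumPy code below is a minimal illustration under stated assumptions — FFT-based cross-correlation stands in for the optical joint transform correlation the chapter's hardware performs, and all function names and parameters are hypothetical:

```python
import numpy as np

def fragment_shift(prev_frag, next_frag):
    """Estimate the 2D displacement between two image fragments via
    FFT-based cross-correlation (a digital stand-in for the joint
    transform optical correlation described in the text)."""
    a = prev_frag - prev_frag.mean()   # remove mean so the peak reflects
    b = next_frag - next_frag.mean()   # structure, not overall brightness
    corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    shape = np.array(corr.shape)
    shift = np.array(np.unravel_index(np.argmax(corr), corr.shape), dtype=float)
    shift[shift > shape / 2] -= shape[shift > shape / 2]  # unwrap circular shifts
    return shift  # (row, col) motion of next_frag relative to prev_frag

def optical_flow(prev_img, next_img, frag=32, step=32):
    """Sparse optical flow field: one motion vector per image fragment."""
    h, w = prev_img.shape
    return {(r, c): fragment_shift(prev_img[r:r + frag, c:c + frag],
                                   next_img[r:r + frag, c:c + frag])
            for r in range(0, h - frag + 1, step)
            for c in range(0, w - frag + 1, step)}
```

Correlation-based matching of this kind is robust to noise because it integrates over the whole fragment, which is one reason the chapter argues for a correlator-based optical flow processor; the computational cost per fragment is what the optical hardware is meant to absorb.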