Network


Latest external collaboration at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Semion Kizhner is active.

Publication


Featured research published by Semion Kizhner.


IEEE Aerospace Conference | 2006

On certain theoretical developments underlying the Hilbert-Huang transform

Semion Kizhner; Karin Blank; Thomas P. Flatley; Norden E. Huang; David J. Petrick; Phyllis Hestnes

One of the main traditional tools used in scientific and engineering data spectral analysis is the Fourier integral transform and its high-performance digital equivalent, the fast Fourier transform (FFT). Both carry strong a-priori assumptions about the source data, such as being linear and stationary and satisfying the Dirichlet conditions. A recent development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang transform (HHT), proposes a novel approach to the solution of the nonlinear class of spectral analysis problems. Using a-posteriori data processing based on the empirical mode decomposition (EMD) sifting process (algorithm), followed by the normalized Hilbert transform of the decomposed data, the HHT allows spectral analysis of nonlinear and nonstationary data. The EMD sifting process results in a non-constrained decomposition of a source numerical data vector into a finite set of intrinsic mode functions (IMFs). These functions form a nearly orthogonal basis derived from the data itself (an adaptive basis). The IMFs can be further analyzed for spectral content by using the classical Hilbert transform. A new engineering spectral analysis tool using the HHT has been developed at NASA GSFC: the HHT data processing system (HHT-DPS). As the HHT-DPS has been successfully used and commercialized, new applications pose additional questions about the theoretical basis behind the HHT EMD algorithm. Why is the fastest-changing component of a composite signal sifted out first in the EMD sifting process? Why does the EMD sifting process seemingly converge, and why does it converge rapidly? Does an IMF have a distinctive structure? Why are the IMFs nearly orthogonal? We address these questions and develop the initial theoretical background for the HHT. This will contribute to the development of new HHT processing options, such as real-time and 2D processing using field programmable gate array (FPGA) computational resources and enhanced HHT synthesis, and will broaden the scope of HHT applications for signal processing.
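
The sifting algorithm sketched below illustrates the mechanics discussed above: repeatedly subtracting the mean of the upper and lower extrema envelopes until an IMF emerges, then removing that IMF from the signal and repeating. It is a minimal illustration in Python, assuming cubic-spline envelopes and a fixed sift count as the stopping rule; it is not the HHT-DPS implementation. It also hints at why the fastest-changing component is sifted out first: that component contributes the most local extrema, so it dominates the first envelope mean.

    import numpy as np
    from scipy.signal import argrelextrema
    from scipy.interpolate import CubicSpline

    def sift_once(s, t):
        """One sifting pass: subtract the mean of the cubic-spline
        envelopes through the local maxima and minima (a sketch,
        not the HHT-DPS code)."""
        maxima = argrelextrema(s, np.greater)[0]
        minima = argrelextrema(s, np.less)[0]
        if len(maxima) < 2 or len(minima) < 2:
            return None                      # too few extrema: residual trend
        upper = CubicSpline(t[maxima], s[maxima])(t)
        lower = CubicSpline(t[minima], s[minima])(t)
        return s - 0.5 * (upper + lower)

    def emd(s, t, n_sifts=10, max_imfs=8):
        """Decompose s(t) into intrinsic mode functions (IMFs) plus a
        residual trend; the fastest-oscillating component emerges first."""
        imfs, residual = [], s.copy()
        for _ in range(max_imfs):
            h = residual
            for _ in range(n_sifts):         # fixed sift count as stop rule
                h_next = sift_once(h, t)
                if h_next is None:
                    return imfs, residual    # monotone residual: done
                h = h_next
            imfs.append(h)
            residual = residual - h
        return imfs, residual

    # Example: an 8 Hz tone rides on a 1 Hz tone plus a slow trend;
    # imfs[0] recovers the 8 Hz tone first.
    t = np.linspace(0.0, 2.0, 2000)
    s = np.sin(2 * np.pi * 8 * t) + 0.5 * np.sin(2 * np.pi * 1 * t) + 0.2 * t
    imfs, trend = emd(s, t)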


IEEE Aerospace Conference | 2004

On the Hilbert-Huang transform data processing system development

Semion Kizhner; Thomas P. Flatley; Norden E. Huang; Karin Blank; Evette Conwell

One of the main heritage tools used in scientific and engineering data spectrum analysis is the Fourier integral transform and its high-performance digital equivalent, the fast Fourier transform (FFT). The long-standing Fourier view of nonlinear mechanics, and the associated FFT (a fairly recent development), carry strong a-priori assumptions about the source data, such as linearity and stationarity. Natural phenomena measurements are essentially nonlinear and nonstationary. A development at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC), known as the Hilbert-Huang transform (HHT), proposes an approach to the solution of the nonlinear class of spectrum analysis problems. Using the empirical mode decomposition (EMD) followed by the Hilbert transform of the empirical decomposition data (HT), as stated in N.E. Huang et al. (1998), N. E. Huang (1999), and N. E. Huang (2001), the HHT allows spectrum analysis of nonlinear and nonstationary data by using engineering a-posteriori data processing based on the EMD algorithm. This results in a non-constrained decomposition of a source real-valued data vector into a finite set of intrinsic mode functions (IMFs) that can be further analyzed for spectrum interpretation by the classical Hilbert transform. This paper describes phase one of the development of a new engineering tool, the HHT data processing system (HHT-DPS). The HHT-DPS allows applying the HHT to a data vector in a fashion similar to the heritage FFT. It is a generic, low-cost, high-performance personal computer (PC) based system that implements the HHT computational algorithms in a user-friendly, file-driven environment. This paper also presents a quantitative analysis for a composite waveform data sample, a summary of technology commercialization efforts, and the lessons learned from this new technology development.
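
The second HHT stage named above, the Hilbert transform of each decomposed component, reduces to a short computation of instantaneous frequency from the analytic signal. The sketch below uses scipy.signal.hilbert as a stand-in and is only an illustration of the step, not the HHT-DPS processing chain.

    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_frequency(imf, fs):
        """Instantaneous frequency (Hz) of one IMF via its analytic
        signal; fs is the sampling rate in Hz. An illustrative sketch."""
        analytic = hilbert(imf)                # imf + j * HilbertTransform(imf)
        phase = np.unwrap(np.angle(analytic))  # continuous instantaneous phase
        return np.gradient(phase) * fs / (2.0 * np.pi)

    # A pure 8 Hz IMF sampled at 1 kHz yields a nearly flat 8 Hz estimate.
    fs = 1000.0
    t = np.arange(0.0, 1.0, 1.0 / fs)
    f_inst = instantaneous_frequency(np.sin(2 * np.pi * 8 * t), fs)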


IEEE Aerospace Conference | 2007

On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

Semion Kizhner; Umeshkumar D. Patel; Meg Vootukuru

Sensor Web-based adaptation and sharing of spaceflight mission resources, including those of the Space-Ground and Control-User communication segments, could greatly benefit from the utilization of heritage Internet protocols and devices applied to spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the SpaceIP utilization of those investments and its acceptance in years to come. As with SpaceIP, commercial real-time, instrument co-located computational resources, data compression, and storage can be enabled on board a spacecraft instrument and, in turn, support a powerful application to the Sensor Web-based design of a spaceflight instrument. These resources are presently co-located only with the spacecraft Command and Data Handling System (C&DH). Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems at the instrument level will commence the application of Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices for the purposes for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications at the instrument level. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature sensor web technologies applicable to spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative Instrument Sensor Web (ISW). This concept widens the scope of heritage sensor webs and facilitates the application of sensor web technologies to complex representative instruments onboard future spacecraft.


Adaptive Hardware and Systems | 2008

On Convergence of Development Costs and Cost Models for Complex Spaceflight Instrument Electronics

Semion Kizhner; Umeshkumar D. Patel; Robert L. Kasa; Phyllis Hestnes; Tammy Brown; Madhavi Vootukuru

Development costs of a few recent spaceflight instrument electrical and electronics subsystems have diverged from the predictions of their respective heritage cost models. The cost models used are grass-roots, PRICE-H [1], and parametric [3]. These cost models originated in the military and industry around 1970 and were successfully adopted and patched by NASA on a mission-by-mission basis for years [2]. However, the complexity of new instruments has recently grown rapidly, by orders of magnitude. This is most obvious in the complexity of a representative spaceflight instrument electronics' data system, which is now required to perform intermediate processing of digitized data apart from conventional processing of science phenomenon signals from multiple detectors. This involves onboard instrument formatting of computational operands from raw data (for example, images), multi-million operations per second on large volumes of data in reconfigurable hardware (in addition to processing on a general-purpose embedded or stand-alone instrument flight computer), as well as making decisions for onboard system adaptation and resource reconfiguration. The conflict between the actual development cost of newer complex instruments and the heritage cost-model predictions for their electronics components seems irreconcilable. This conflict and an approach to its resolution are addressed in this paper by determining the complexity parameters and a complexity index, and by their use in an enhanced cost model. This is expected to facilitate further enhancements to existing cost models, resulting in a smaller difference (convergence) between the electronics' development costs and the model-predicted costs.
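
The enhanced cost model's general shape can be illustrated schematically: derive a normalized complexity index from ratios of complexity parameters against a heritage baseline, then scale the heritage estimate by that index. In the Python sketch below, the parameter names, weights, and scaling exponent are all hypothetical placeholders, not the paper's calibrated values.

    # Hypothetical illustration of a complexity-index-adjusted cost model.
    # Parameter names, weights, and the exponent are placeholders, not the
    # calibrated model from the paper.

    def complexity_index(params, baseline, weights):
        """Weighted geometric-mean ratio of a new instrument's complexity
        parameters to a heritage baseline (1.0 = as complex as heritage)."""
        index = 1.0
        for key, w in weights.items():
            index *= (params[key] / baseline[key]) ** w
        return index

    def enhanced_cost(heritage_estimate, index, exponent=0.8):
        """Scale a heritage model's estimate by the complexity index;
        the sub-linear exponent is a hypothetical calibration knob."""
        return heritage_estimate * index ** exponent

    baseline = {"mops": 1e6, "gates": 1e5, "data_rate_mbps": 1.0}
    new_inst = {"mops": 1e9, "gates": 1e7, "data_rate_mbps": 100.0}
    weights = {"mops": 0.4, "gates": 0.4, "data_rate_mbps": 0.2}

    idx = complexity_index(new_inst, baseline, weights)  # ~250x heritage
    cost = enhanced_cost(10.0, idx)                      # scales a $10M estimate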


Optical Tools for Manufacturing and Advanced Automation | 1993

Capaciflector sensor imaging and ranging applications for robot control

Semion Kizhner

One basic robotics task is to position a robot tool point over the geometric center of a symmetric object. This paper presents a fully autonomous control-command development technique of linear complexity that solves the robot tool point centering and alignment problem. It is based on non-visual sensor imaging and starts with only a partial image of the target in its field of view. A new capacitive proximity and imaging sensor, called a capaciflector, is used to obtain 2D discrete images of objects with moderate surface complexity. The objects considered are spacecraft parts whose imaged surfaces are flat, with a simple geometric shape such as a solid rectangle or a circle.
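
A linear-complexity centering step of this general kind can be sketched as a centroid computation over a thresholded 2D sensor image, followed by a relative move command. The Python sketch below is a generic illustration that assumes the whole object is in the field of view; the paper's technique additionally handles partial images, which this sketch does not.

    import numpy as np

    def center_offset(image, threshold):
        """Offset (drow, dcol) from the image center to the centroid of a
        thresholded 2D image; O(N) in the number of pixels. Assumes the
        full object is visible (a simplification of the paper's problem)."""
        rows, cols = np.nonzero(image > threshold)
        if rows.size == 0:
            raise ValueError("no object above threshold")
        center_r = (image.shape[0] - 1) / 2.0
        center_c = (image.shape[1] - 1) / 2.0
        return rows.mean() - center_r, cols.mean() - center_c

    # Example: a solid rectangle offset from the image center; the result
    # (-12.0, +3.0) is the relative move command for the tool point.
    img = np.zeros((64, 64))
    img[10:30, 20:50] = 1.0          # rectangle centered at (19.5, 34.5)
    drow, dcol = center_offset(img, 0.5)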


Adaptive Hardware and Systems | 2010

On DESTINY instrument electrical and electronics subsystem framework

Semion Kizhner; Dominic J. Benford; Tod R. Lauer

Future space missions are going to require a few large focal planes with many sensing arrays and hundreds of millions of pixels, all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design, and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM), charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy, a hypothetical form of energy that permeates all of space and tends to increase the rate of expansion. One of three JDEM concept studies, the Dark Energy Space Telescope (DESTINY), was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework that evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits (ROICs/ASICs) in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of “warm” EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp (SUTR) algorithm, over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low-volume telemetry handling, power conversion, and communications with the spacecraft. This paper outlines how the JDEM DESTINY instrument EE subsystem can be built now, with a design that is generally applicable to a wide variety of missions using large focal planes with large mosaics of sensors.
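
In its simplest form, the SUTR computation mentioned above is a per-pixel least-squares slope fit over the non-destructive reads of a ramp. The vectorized Python sketch below shows that basic, equally weighted form; flight pipelines add cosmic-ray rejection and noise-optimal weighting, and the DESTINY implementation runs in FPGAs rather than software.

    import numpy as np

    def sutr_slope(ramp, dt):
        """Per-pixel least-squares slope (counts/s) from a stack of
        non-destructive reads: ramp has shape (n_reads, rows, cols) and
        dt is the interval between reads. Equal-weighted sketch only."""
        n = ramp.shape[0]
        t = np.arange(n) * dt
        tc = t - t.mean()
        # closed-form least squares: sum(tc * y) / sum(tc^2), per pixel
        return np.tensordot(tc, ramp, axes=(0, 0)) / (tc ** 2).sum()

    # Example: 20 reads of a 2x2 "detector" accumulating ~50 counts/s.
    rng = np.random.default_rng(0)
    reads = 50.0 * np.arange(20)[:, None, None] + rng.normal(0, 5, (20, 2, 2))
    rates = sutr_slope(reads, dt=1.0)   # close to 50.0 in each pixel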


Adaptive Hardware and Systems | 2012

On development of Hilbert-Huang Transform data processing real-time system with 2D capabilities

Semion Kizhner; Karin Blank; Jennifer A. Sichler; Umeshkumar D. Patel; Jacqueline Le Moigne; Esam El-Araby; Vinh Dang

Unlike other digital signal processing techniques, such as the Fast Fourier Transform for one-dimensional (1D) and two-dimensional (2D) data (FFT1 and FFT2), that assume signal linearity and stationarity, the Hilbert-Huang Transform (HHT) utilizes relationships between an arbitrary signal's local extrema to find the signal's instantaneous spectral representation. This is done in two steps. First, the Huang Empirical Mode Decomposition (EMD) separates an input signal of one variable s(t) into a finite set of narrow-band Intrinsic Mode Functions {IMF1(t), IMF2(t), ..., IMFk(t)} that add up to the signal s(t). The IMFs comprise the signal's adaptive basis, derived from the signal itself, as opposed to the artificial basis imposed by the FFT or other heritage frequency analysis methods. Second, the HHT applies the Hilbert Transform to each signal constituent IMFi(t) to obtain the corresponding analytic signal Si(t). From the analytic signal the HHT generates the Hilbert-Huang spectrum: a single instantaneous frequency ωi(t) for the signal Si(t) at each argument t is obtained for each of the k Huang IMFs. This yields the Hilbert-Huang spectrum {ω(IMF1(t)), ω(IMF2(t)), ..., ω(IMFk(t))} at each domain argument t of s(t), which was not obtainable otherwise. The HHT and its engineering implementation for 1D, the HHT Data Processing System (HHT-DPS), were developed at the NASA Goddard Space Flight Center (GSFC). The HHT-DPS is the reference system now used around the world. However, the state-of-the-art HHT-DPS works only for 1D data, as designed, and it is not a real-time system. This paper describes the development of the reference HHT Data Processing Real-Time System (HHTPS-RT) with 2D capabilities (HHT2), with the processing of large images as the development goal. It describes the methodology of research and development of the new reference HHT2 Empirical Mode Decomposition for 2D (EMD2) system and its algorithms, which require high-capability computing. It provides the system prototype's test results, introduces the HHT2 spectrum concepts, and concludes with suggested areas for future research.
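
For the 1D case, the Hilbert-Huang spectrum described above can be assembled by binning each IMF's instantaneous frequency track, weighted by its instantaneous amplitude, into a time-frequency grid. The Python sketch below illustrates only this 1D assembly step (the frequency tracks would come from a per-IMF Hilbert transform, as sketched earlier); it does not attempt the paper's 2D EMD2/HHT2 extension.

    import numpy as np

    def hilbert_huang_spectrum(freqs, amps, f_bins):
        """Accumulate per-IMF instantaneous frequency/amplitude tracks
        into a time-frequency image H[f_bin, t]; freqs and amps are
        lists of arrays, one pair per IMF. An illustrative sketch."""
        H = np.zeros((len(f_bins) - 1, len(freqs[0])))
        for f, a in zip(freqs, amps):
            idx = np.digitize(f, f_bins) - 1          # frequency bin per sample
            valid = (idx >= 0) & (idx < len(f_bins) - 1)
            H[idx[valid], np.nonzero(valid)[0]] += a[valid]
        return H

    # Synthetic example: one IMF holding a steady 8 Hz with unit amplitude
    # puts all its energy in the 8 Hz row of the spectrum.
    freqs = [np.full(1000, 8.0)]
    amps = [np.ones(1000)]
    H = hilbert_huang_spectrum(freqs, amps, np.linspace(0.0, 20.0, 41))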


IEEE Aerospace Conference | 2010

On DESTINY science instrument electrical and electronics subsystem framework

Semion Kizhner; Dominic J. Benford; Tod R. Lauer

Future space missions are going to require a few large focal planes with many sensing arrays and hundreds of millions of pixels, all read out at high data rates. This will place unique demands on the electrical and electronics (EE) subsystem design, and it will be critically important to have high technology readiness level (TRL) EE concepts ready to support such missions. One such mission is the Joint Dark Energy Mission (JDEM), charged with making precise measurements of the expansion rate of the universe to reveal vital clues about the nature of dark energy, a hypothetical form of energy that permeates all of space and tends to increase the rate of expansion. One of three JDEM concept studies, the Dark Energy Space Telescope (DESTINY), was conducted in 2008 at NASA's Goddard Space Flight Center (GSFC) in Greenbelt, Maryland. This paper presents the EE subsystem framework that evolved from the DESTINY science instrument study. It describes the main challenges and implementation concepts related to the design of an EE subsystem featuring multiple focal planes populated with dozens of large arrays and millions of pixels. The focal planes are passively cooled to cryogenic temperatures (below 140 K). The sensor mosaic is controlled by a large number of Readout Integrated Circuits and Application Specific Integrated Circuits (ROICs/ASICs) in near proximity to their sensor focal planes. The ASICs, in turn, are serviced by a set of “warm” EE subsystem boxes performing Field Programmable Gate Array (FPGA) based digital signal processing (DSP) computations of complex algorithms, such as the sampling-up-the-ramp (SUTR) algorithm, over large volumes of fast data streams. The SUTR boxes are supported by the Instrument Control/Command and Data Handling box (ICDH Primary and Backup boxes) for lossless data compression, command and low-volume telemetry handling, power conversion, and communications with the spacecraft. This paper outlines how the JDEM DESTINY instrument EE subsystem can be built now, with a design that is generally applicable to a wide variety of missions using large focal planes with large mosaics of sensors.


Adaptive Hardware and Systems | 2009

New Methodology for Reducing Sensor and Readout Electronics Circuitry Noise in Digital Domain

Semion Kizhner; Katherine Heinzen

Upcoming NASA cosmology survey missions, such as the Joint Dark Energy Mission (JDEM), carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable the new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by the thermal noise level at a given temperature in the analog domain, we must find new ways of further compensating for the noise in the digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated (photon-insensitive) reference pixels, which can be used to reduce noise attributed to the sensor and readout electronics. A few methodologies have been proposed for processing, in the digital domain, the information carried by the reference pixels. These methods involve using spatial and temporal global statistical scalar parameters, derived from boundary reference pixel information, to enhance the active pixels' signals. To take a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert-Huang Transform Data Processing System (HHT-DPS) to a component of the reference pixel vectors' information. This allows us to derive a noise-correction array which, in addition to the statistical parameter over the signal trend, is applied to the active pixel array.
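
The heritage scalar correction described above can be sketched in a few lines: subtract a global statistic of the border reference pixels from the active array. The Python sketch below assumes a 4-pixel reference border purely for illustration; the paper's enhancement replaces the scalar with an HHT-DPS-derived correction array, which is not reproduced here.

    import numpy as np

    def reference_pixel_correct(frame, border=4):
        """Heritage-style correction: subtract the mean of the
        non-illuminated border reference pixels from the whole frame,
        removing common-mode readout drift. Border width is an assumed
        layout, not a specific detector's."""
        mask = np.zeros(frame.shape, dtype=bool)
        mask[:border, :] = True                # top reference rows
        mask[-border:, :] = True               # bottom reference rows
        mask[:, :border] = True                # left reference columns
        mask[:, -border:] = True               # right reference columns
        return frame - frame[mask].mean()

    # Example: a frame with a +3.2-count common-mode offset is restored
    # to a near-zero mean after correction.
    rng = np.random.default_rng(1)
    frame = rng.normal(0.0, 1.0, (32, 32)) + 3.2
    corrected = reference_pixel_correct(frame)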


Proceedings of SPIE | 2004

Solar viewing interferometer prototype

Richard G. Lyon; Jay R. Herman; Nader Abuhassan; Catherine T. Marx; Semion Kizhner; Julie A. Crooke; Ronald W. Toland; Albert Mariano; Cheryl Salerno; Gary Brown; Tony Cazeau; Peter Petrone; Billy Mamakos; Severine C. Tournois

The Earth Atmospheric Solar-Occultation Imager (EASI) is a proposed interferometer with 5 telescopes on an 8-meter boom in a 1D Fizeau configuration. Placed at the Earth-Sun L2 Lagrange point, EASI would perform absorption spectroscopy of the Earth's atmosphere as it occults the Sun. Fizeau interferometers give spatial resolution comparable to a filled aperture but with lower collecting area. Even with the small collecting area, the high solar flux requires most of the energy to be reflected back to space. EASI will require closed-loop control of the optics to compensate for spacecraft and instrument motions, thermal and structural transients, and pointing jitter. The Solar Viewing Interferometry Prototype (SVIP) is a prototype ground instrument to study the needed wavefront control methods. SVIP consists of three 10 cm aperture telescopes, in a linear configuration, on a 1.2-meter boom; it will estimate atmospheric abundances of O2, H2O, CO2, and CH4 versus altitude and azimuth in the 1.25 - 1.73 micron band. SVIP measures the greenhouse gas absorption while looking at the Sun, and uses solar granulation to deduce piston, tip, and tilt misalignments from atmospheric turbulence and the instrument structure. Tip/tilt sensors determine relative/absolute telescope pointing and operate from 0.43 - 0.48 microns to maximize contrast. Two piston sensors, using a robust variation of dispersed fringes, determine piston shifts between the baselines and operate from 0.5 - 0.73 microns. All sensors are sampled at 800 Hz, processed with a DSP computer, and fed back at 200 Hz (3 dB) to the active optics. A 4 Hz error signal is also fed back to the tracking platform. Optical performance will be maintained to better than λ/8 rms in closed loop.
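
The closed-loop control architecture described, sense at 800 Hz and correct the active optics with a feedback update, can be illustrated with a generic discrete-time integrator loop. In the Python sketch below, the gain and the random-walk disturbance are made-up illustrative values, not SVIP's calibrated control law or its measured turbulence.

    import numpy as np

    # Generic single-axis integrator loop of the kind SVIP's active optics
    # close: sense a misalignment error each sample, update the corrector.
    # Gain and disturbance model are illustrative assumptions.
    fs = 800.0                     # sensor sample rate (from the abstract), Hz
    gain = 0.8                     # integrator gain (hypothetical)
    rng = np.random.default_rng(2)

    command, disturbance, residuals = 0.0, 0.0, []
    for _ in range(int(2 * fs)):               # two seconds of operation
        disturbance += rng.normal(0.0, 0.01)   # random-walk turbulence proxy
        error = disturbance - command          # residual seen by the sensor
        command += gain * error                # integral feedback update
        residuals.append(error)

    rms = np.sqrt(np.mean(np.square(residuals[200:])))  # settled residual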

Collaboration


Dive into Semion Kizhner's collaboration.

Top Co-Authors

Karin Blank
Goddard Space Flight Center

Thomas P. Flatley
Goddard Space Flight Center

Phyllis Hestnes
Goddard Space Flight Center

Norden E. Huang
National Central University

Dominic J. Benford
Goddard Space Flight Center

David J. Petrick
Goddard Space Flight Center