Publication


Featured research published by Yitzhak Yitzhaky.


IEEE Transactions on Pattern Analysis and Machine Intelligence | 2003

A method for objective edge detection evaluation and detector parameter selection

Yitzhak Yitzhaky; Eli Peli

Subjective evaluation by human observers is usually used to analyze and select an edge detector's parametric setup when real-world images are considered. We propose a statistical, objective performance analysis and detector parameter selection that uses the detection results produced by different detector parameters. Using the correspondence between the different detection results, an estimated best edge map, utilized as an estimated ground truth (EGT), is obtained. This is done using both a receiver operating characteristic (ROC) analysis and a chi-square test, and it considers the trade-off between information and noisiness in the detection results. The best edge detector parameter set (PS) is then selected by the same statistical approach, using the EGT. Results are demonstrated for several edge detection techniques and compared with published subjective evaluation results. The method developed here provides a general tool to assist in practical implementations of parametric edge detectors where an automatic process is required.
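
As a rough illustration of how an estimated ground truth could be formed from the correspondence between detection results, the Python sketch below substitutes a simple pixel-wise vote for the paper's ROC/chi-square analysis; the function names, the vote_threshold parameter, and the distance-to-ideal scoring are illustrative assumptions, not the published procedure.

import numpy as np

def estimate_ground_truth(edge_maps, vote_threshold=0.5):
    """Estimate a ground-truth edge map from several binary detection results.

    A simplified stand-in for the paper's ROC/chi-square procedure: a pixel is
    accepted into the estimated ground truth (EGT) when enough detectors agree.
    """
    stack = np.stack([m.astype(bool) for m in edge_maps], axis=0)
    agreement = stack.mean(axis=0)          # fraction of detectors marking each pixel
    return agreement >= vote_threshold      # binary EGT

def score_against_egt(edge_map, egt):
    """Score one detector output against the EGT (true/false positive rates)."""
    edge_map = edge_map.astype(bool)
    tp = np.logical_and(edge_map, egt).sum()
    fp = np.logical_and(edge_map, ~egt).sum()
    tpr = tp / max(egt.sum(), 1)
    fpr = fp / max((~egt).sum(), 1)
    return tpr, fpr

if __name__ == "__main__":
    # Stand-in detector outputs; pick the parameter set closest to (TPR=1, FPR=0).
    rng = np.random.default_rng(0)
    maps = [rng.random((64, 64)) > t for t in (0.6, 0.7, 0.8)]
    egt = estimate_ground_truth(maps)
    scores = [score_against_egt(m, egt) for m in maps]
    best = min(range(len(maps)), key=lambda i: (1 - scores[i][0]) ** 2 + scores[i][1] ** 2)
    print("best parameter set:", best, "TPR/FPR:", scores[best])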


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 1998

Direct method for restoration of motion-blurred images

Yitzhak Yitzhaky; I. Mor; A. Lantzman; Norman S. Kopeika

We deal with the problem of restoring images blurred by relative motion between the camera and the object of interest. This problem is common when the imaging system is mounted on a moving vehicle or held by human hands, and in robot vision. For correct restoration of the degraded image, it is useful to know the point-spread function (PSF) of the blurring system. We propose a straightforward method to restore motion-blurred images given only the blurred image itself. The method first identifies the PSF of the blur and then uses it to restore the blurred image. The blur identification is based on the concept that image characteristics along the direction of motion are affected mostly by the blur and differ from the characteristics in other directions. By filtering the blurred image, we emphasize the PSF correlation properties at the expense of those of the original image. Experimental results for image restoration are presented for both synthetic and real motion blur.
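
The sketch below illustrates the general flavor of such a direct pipeline under strong simplifying assumptions (horizontal, uniform-velocity blur and a synthetic random scene): the blurred image is differentiated along the motion direction, the blur extent is read off the averaged autocorrelation of the derivative, and a Wiener filter restores the image. The function names and the nsr constant are illustrative; this is not the paper's exact algorithm.

import numpy as np

def estimate_blur_extent(blurred, max_extent=40):
    """Estimate horizontal motion-blur extent from the blurred image alone.

    Differentiating along the assumed motion direction suppresses the scene
    content; the averaged autocorrelation of the rows then shows a negative
    dip at a lag equal to the blur extent.
    """
    d = np.diff(blurred.astype(float), axis=1)          # derivative along motion
    d -= d.mean(axis=1, keepdims=True)
    acf = np.zeros(max_extent + 1)
    for row in d:
        full = np.correlate(row, row, mode="full")
        mid = len(row) - 1
        acf += full[mid:mid + max_extent + 1]
    acf /= acf[0]                                        # normalize to zero lag
    return int(np.argmin(acf[1:]) + 1)                   # lag of the strongest dip

def wiener_restore(blurred, psf, nsr=1e-2):
    """Restore with a frequency-domain Wiener filter built from the identified PSF."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * G))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sharp = rng.random((128, 128))
    extent = 9
    psf = np.full((1, extent), 1.0 / extent)             # uniform horizontal motion PSF
    blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(psf, s=sharp.shape)))
    est = estimate_blur_extent(blurred)
    restored = wiener_restore(blurred, np.full((1, est), 1.0 / est))
    print("estimated extent:", est, "restored shape:", restored.shape)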


Graphical Models and Image Processing | 1997

Identification of blur parameters from motion blurred images

Yitzhak Yitzhaky; Norman S. Kopeika

The problem of restoring images blurred by relative motion between the camera and the object scene is important in a large number of applications. The solution proposed here identifies important parameters with which to characterize the point spread function (PSF) of the blur, given only the blurred image itself. This identification method is based on the concept that image characteristics along the direction of motion are different from the characteristics in other directions. Depending on the PSF shape, the homogeneity and smoothness of the blurred image in the motion direction are greater than in other directions. Furthermore, in this direction, correlation exists between the pixels that form the blur of the originally unblurred objects. By filtering the blurred image, we emphasize the PSF characteristics at the expense of the image characteristics. The method proposed here identifies the direction and the extent of the PSF of the blur and evaluates its shape, which depends on the type of motion during the exposure. Correct identification of the PSF parameters permits fast, high-resolution restoration of the blurred image.
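
A minimal sketch of the direction-identification idea, assuming uniform linear motion: the blurred image is smoothest along the motion direction, so the energy of the directional derivative is minimized at the blur angle. The gradient-based measure and the function names are illustrative stand-ins for the paper's method.

import numpy as np

def estimate_blur_direction(blurred, angles_deg=range(180)):
    """Estimate the motion-blur direction (degrees) from the blurred image alone.

    Scans candidate angles and returns the direction along which the mean
    squared directional derivative is smallest.
    """
    gy, gx = np.gradient(blurred.astype(float))
    gxx, gyy, gxy = np.mean(gx * gx), np.mean(gy * gy), np.mean(gx * gy)
    best_angle, best_energy = 0.0, np.inf
    for a in angles_deg:
        t = np.deg2rad(a)
        # Mean squared derivative taken along direction a (quadratic form in cos/sin).
        energy = (np.cos(t) ** 2 * gxx + np.sin(t) ** 2 * gyy
                  + 2.0 * np.sin(t) * np.cos(t) * gxy)
        if energy < best_energy:
            best_angle, best_energy = float(a), energy
    return best_angle

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    sharp = rng.random((256, 256))
    kernel = np.ones(9) / 9
    # Hypothetical vertical (90 degree) motion blur, applied column by column.
    blurred = np.apply_along_axis(np.convolve, 0, sharp, kernel, mode="same")
    print("estimated blur direction:", estimate_blur_direction(blurred), "degrees")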


SPIE's 1996 International Symposium on Optical Science, Engineering, and Instrumentation | 1996

Identification of blur parameters from motion-blurred images

Yitzhak Yitzhaky; Norman S. Kopeika

A difficult problem in imaging systems is image degradation caused by motion. This problem is common when the imaging system is in a moving vehicle such as a tank or a plane, and even when the camera is held by human hands. For correct restoration of the degraded image we need to know the point spread function (PSF) of the blurring system. In this paper we propose a method to identify important parameters with which to characterize the PSF of the blur, given only the blurred image itself. A first step of this method was suggested in an earlier paper, where only the blur-extent parameter was considered. The identification method here is based on the concept that image characteristics along the direction of motion are different from the characteristics in other directions. Depending on the PSF shape, the homogeneity and smoothness of the blurred image in the motion direction are greater than in other directions. Furthermore, in the motion direction, correlation exists between the pixels that form the blur of the originally unblurred objects. The method proposed here identifies the direction and the extent of the PSF of the blur and evaluates its shape, which depends on the type of motion during the exposure. Correct identification of the PSF parameters permits fast, high-resolution restoration of the blurred image.
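
Once the direction and extent are identified, a deconvolution filter needs the corresponding PSF kernel. The sketch below builds a line-shaped PSF under the assumption of uniform-velocity motion; the nearest-pixel rasterization and the motion_psf name are illustrative choices, and other motion types discussed in the paper would change the weights along the line.

import numpy as np

def motion_psf(extent, angle_deg, size=None):
    """Build a normalized line-shaped PSF for uniform linear motion.

    `extent` is the blur length in pixels and `angle_deg` the motion direction.
    A flat weighting along the line assumes uniform velocity during the exposure.
    """
    size = size or (extent + 2)
    psf = np.zeros((size, size))
    center = (size - 1) / 2.0
    theta = np.deg2rad(angle_deg)
    # Sample points along the motion line and accumulate them into the grid
    # with nearest-pixel assignment (adequate for a sketch).
    for t in np.linspace(-(extent - 1) / 2.0, (extent - 1) / 2.0, extent * 4):
        r = int(round(center - t * np.sin(theta)))
        c = int(round(center + t * np.cos(theta)))
        psf[r, c] += 1.0
    return psf / psf.sum()

if __name__ == "__main__":
    # Example: a 9-pixel blur at 30 degrees, ready to hand to a Wiener filter.
    print(np.round(motion_psf(9, 30.0), 2))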


Optical Engineering | 1997

Restoration of atmospherically blurred images according to weather-predicted atmospheric modulation transfer functions

Yitzhak Yitzhaky; Itai Dror; Norman S. Kopeika

Restoration of actual atmospherically blurred images is performed using an atmospheric Wiener filter that corrects simultaneously for both turbulence and aerosol blur by enhancing the image spectrum primarily at those high frequencies least affected by the jitter or randomness in a turbulence modulation transfer function (MTF). The correction is based on weather-predicted rather than measured atmospheric MTFs. Both turbulence and aerosol MTFs are predicted using meteorological parameters measured with standard weather stations at the time and location where the image was recorded. A variety of weather conditions and seasons are considered. Past results have shown good correlation between measured and predicted atmospheric MTFs. Here, the predicted MTFs are implemented in actual image restoration, and quantitative analysis of the MTF improvement is presented. Corrections are also shown for turbulence blur alone, for aerosol blur alone, and for both together.
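
A minimal sketch of the restoration step, with Gaussian-shaped placeholders standing in for the weather-predicted turbulence and aerosol MTFs (the paper derives these from meteorological measurements): the two MTFs are multiplied into a combined atmospheric MTF and used inside a Wiener-type correction filter. The scale parameters and the nsr value are arbitrary assumptions.

import numpy as np

def radial_frequency(shape):
    """Normalized radial spatial-frequency grid (0 at DC, about 0.7 at the corners)."""
    fy = np.fft.fftfreq(shape[0])
    fx = np.fft.fftfreq(shape[1])
    return np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)

def atmospheric_wiener_restore(image, turb_scale=0.15, aero_scale=0.25, nsr=1e-2):
    """Correct turbulence and aerosol blur with a single Wiener-type filter.

    The Gaussian MTF shapes below are placeholders for the weather-predicted
    MTFs used in the paper.
    """
    f = radial_frequency(image.shape)
    mtf_turbulence = np.exp(-(f / turb_scale) ** 2)   # stand-in turbulence MTF
    mtf_aerosol = np.exp(-(f / aero_scale) ** 2)      # stand-in aerosol MTF
    mtf_total = mtf_turbulence * mtf_aerosol          # combined atmospheric MTF
    G = np.fft.fft2(image.astype(float))
    W = mtf_total / (mtf_total ** 2 + nsr)            # Wiener-type correction filter
    return np.real(np.fft.ifft2(W * G))

if __name__ == "__main__":
    img = np.random.default_rng(3).random((128, 128))
    restored = atmospheric_wiener_restore(img)
    print("restored shape:", restored.shape)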


Applied Optics | 1999

Comparison of direct blind deconvolution methods for motion-blurred images

Yitzhak Yitzhaky; Ruslan Milberg; Sergei Yohaev; Norman S. Kopeika

Direct methods for the restoration of images blurred by motion are analyzed and compared. The term direct means that the considered methods are performed in a one-step fashion, without any iterative technique. The blurring point-spread function is assumed to be unknown, and therefore the image restoration process is called blind deconvolution. What is believed to be a new direct method, here called the whitening method, was recently developed. This method and other existing direct methods, such as the homomorphic and cepstral techniques, are studied and compared for a variety of motion types. Various criteria such as quality of restoration, sensitivity to noise, and computation requirements are considered. It appears that the recently developed method shows some improvements over the older methods. The research presented here clarifies the differences among the direct methods and offers an experimental basis for choosing which blind deconvolution method to use. In addition, some improvements to the methods are suggested.
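
As one concrete example of the techniques being compared, the sketch below implements a simplified 1-D cepstral estimator of horizontal blur extent (not the whitening method introduced in the paper): a uniform blur of length L produces periodic dips in the spectrum, which appear as a pronounced negative cepstral peak at lag L.

import numpy as np

def cepstral_blur_extent(blurred, max_extent=40):
    """Estimate horizontal motion-blur extent from the averaged row cepstrum.

    The log spectrum of each row is inverse-transformed and averaged; the lag of
    the most negative cepstral value is taken as the blur extent.
    """
    spectrum = np.abs(np.fft.fft(blurred.astype(float), axis=1))
    log_spectrum = np.log(spectrum + 1e-8)                  # guard against zeros
    cepstrum = np.real(np.fft.ifft(log_spectrum, axis=1)).mean(axis=0)
    return int(np.argmin(cepstrum[1:max_extent + 1]) + 1)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    sharp = rng.random((128, 256))
    extent = 11
    # Circular uniform blur of length 11 applied along the rows.
    H = np.fft.fft(np.ones(extent) / extent, n=sharp.shape[1])
    blurred = np.real(np.fft.ifft(np.fft.fft(sharp, axis=1) * H[None, :], axis=1))
    print("estimated extent:", cepstral_blur_extent(blurred))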


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2004

Wideband enhancement of television images for people with visual impairments

Eli Peli; Jeonghoon Kim; Yitzhak Yitzhaky; Robert Goldstein; Russell L. Woods

Wideband enhancement was implemented by detecting visually relevant edge and bar features in an image to produce a bipolar contour map. The addition of these contours to the original image resulted in increased local contrast of these features and an increase in the spatial bandwidth of the image. Testing with static television images revealed that visually impaired patients (n = 35) could distinguish the enhanced images and preferred them over the original images (and degraded images). Most patients preferred a moderate level of wideband enhancement, since they preferred natural-looking images and rejected visible artifacts of the enhancement. Comparison of the enhanced images with the originals revealed that the improvement in the perceived image quality was significant for only 22% of the patients. Possible reasons for the limited increase in perceived image quality are discussed, and improvements are suggested.
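
A minimal sketch of the enhancement idea, using a difference-of-Gaussians band-pass as a stand-in for the paper's detector of visually relevant edge and bar features: the bipolar contour map is scaled by a gain (playing the role of the enhancement level patients chose) and added back to the original image.

import numpy as np
from scipy.ndimage import gaussian_filter

def wideband_enhance(image, sigma=1.5, gain=0.5):
    """Add a bipolar contour map back to the image to boost edge/bar contrast.

    The difference-of-Gaussians below is only a stand-in feature detector; the
    sigma and gain values are illustrative assumptions.
    """
    img = image.astype(float)
    # Bipolar contour map: a band-pass response, positive on one side of an
    # edge and negative on the other.
    contours = gaussian_filter(img, sigma) - gaussian_filter(img, 2 * sigma)
    enhanced = img + gain * contours
    return np.clip(enhanced, 0.0, 1.0)   # keep the result in display range

if __name__ == "__main__":
    frame = np.random.default_rng(5).random((64, 64))
    print("enhanced shape:", wideband_enhance(frame).shape)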


Optical Engineering | 2000

Restoration of an image degraded by vibrations using only a single frame

Yitzhak Yitzhaky; G. Boshusha; Y. Levy; Norman S. Kopeika

A recently developed method for the restoration of motion-blurred images is investigated and implemented for the special complicated case of image blur due to sinusoidal vibrations. Sinusoidal vibrations are analyzed in the context of blur identification and image restoration. The extent of the blur and the optical transfer function (OTF) are identified from the blurred image by a straightforward process without the use of iterative techniques. The blurred image is restored using a simple Wiener filter with the identified OTF. The main novel achievement is the use of only a single vibrated blurred image as input information, on which the restoration process is based. The various cases of blur types that depend on the imaging conditions are considered. Examples of blur identification and image restoration are presented.
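
A sketch of the long-exposure sinusoidal-vibration case, assuming the vibration amplitude has already been identified from the blurred image: the PSF is built numerically from one period of the motion, and its OTF is used in a simple Wiener filter. The amplitude, kernel size, and nsr values are illustrative assumptions.

import numpy as np

def sinusoidal_vibration_psf(amplitude_px, size):
    """Discrete 1-D PSF of long-exposure sinusoidal vibration of given amplitude.

    Over many vibration periods the sensor dwells longest near the turning
    points, so the PSF piles up near +/- amplitude; it is built here by
    histogramming one period of the motion.
    """
    t = np.linspace(0.0, 2.0 * np.pi, 20000, endpoint=False)
    positions = amplitude_px * np.sin(t)
    center = size // 2
    edges = np.arange(size + 1) - center - 0.5       # pixel-centered bins
    psf, _ = np.histogram(positions, bins=edges)
    psf = psf.astype(float)
    return psf / psf.sum()

def wiener_restore_rows(blurred, psf_1d, nsr=1e-2):
    """Row-wise Wiener restoration using the OTF of the identified vibration PSF."""
    n = blurred.shape[1]
    kernel = np.zeros(n)
    kernel[:len(psf_1d)] = psf_1d
    kernel = np.roll(kernel, -(len(psf_1d) // 2))    # center the PSF at index 0
    H = np.fft.fft(kernel)                           # identified OTF
    G = np.fft.fft(blurred.astype(float), axis=1)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)          # simple Wiener filter
    return np.real(np.fft.ifft(G * W[None, :], axis=1))

if __name__ == "__main__":
    psf = sinusoidal_vibration_psf(amplitude_px=4.0, size=15)
    frame = np.random.default_rng(6).random((64, 128))
    print("PSF:", np.round(psf, 3))
    print("restored shape:", wiener_restore_rows(frame, psf).shape)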


Signal, Image and Video Processing | 2010

No-reference assessment of blur and noise impacts on image quality

Erez Cohen; Yitzhak Yitzhaky

The quality of images may be severely degraded in various situations, such as imaging during motion, sensing through a diffusive medium, and low signal-to-noise ratio. Often in such cases the ideal, un-degraded image is not available (no reference exists). This paper reviews past methods for no-reference (NR) image quality assessment and then proposes a new NR method for identifying image distortions and quantifying their impact on image quality. The proposed method considers both noise and blur distortion types that may exist in the image. The same methodology, employed in the spatial-frequency domain, is used to evaluate the impact of both distortions on image quality, while the noise power is further estimated independently in the spatial domain. Specific distortions addressed here include additive white noise, Gaussian blur, and defocus blur. Estimation results are compared to the true distortion quantities over a set of 75 different images.
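
The sketches below stand in for the two kinds of NR measures described in the abstract: a spatial-domain noise estimate (here a common median-filter/MAD estimator, not necessarily the paper's) and a spectral measure whose value drops as blur removes high-frequency content. The cutoff and the synthetic test scene are arbitrary assumptions.

import numpy as np
from scipy.ndimage import median_filter

def estimate_noise_sigma(image):
    """Spatial-domain estimate of additive white-noise standard deviation.

    Uses the median absolute deviation of the residual left by a 3x3 median
    filter, a common robust estimator.
    """
    img = image.astype(float)
    residual = img - median_filter(img, size=3)
    return 1.4826 * np.median(np.abs(residual - np.median(residual)))

def high_frequency_energy_ratio(image, cutoff=0.25):
    """Fraction of spectral energy above a normalized radial-frequency cutoff.

    Blur suppresses high spatial frequencies, so this ratio drops as blur grows.
    """
    F = np.abs(np.fft.fft2(image.astype(float))) ** 2
    fy = np.fft.fftfreq(image.shape[0])
    fx = np.fft.fftfreq(image.shape[1])
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    return float(F[radius > cutoff].sum() / F.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    _, xx = np.mgrid[0:128, 0:128]
    clean = 0.5 + 0.4 * np.sin(xx / 10.0)                  # smooth synthetic scene
    noisy = clean + 0.05 * rng.standard_normal(clean.shape)
    print("noise sigma estimate:", round(estimate_noise_sigma(noisy), 3))
    print("high-frequency energy ratio:", round(high_frequency_energy_ratio(noisy), 3))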


IEEE Sensors Journal | 2011

Inexpensive THz Focal Plane Array Imaging Using Miniature Neon Indicator Lamps as Detectors

Daniel Rozban; Assaf Levanon; Hezi Joseph; Avihai Aharon Akram; A. Abramovich; N. S. Kopeika; Yitzhak Yitzhaky; Alexander Belenky; Orly Yadid-Pecht

The development of focal plane arrays (FPAs) for mm-wavelength and THz radiation is presented in this paper. The FPA is based upon inexpensive neon indicator lamp glow discharge detectors (GDDs) that serve as the pixels of the FPA. Previous investigations have shown that inexpensive neon indicator lamp GDDs are quite sensitive to mm-wavelength and THz radiation. The diameters of GDD lamps are typically 3-6 mm, and thus the FPA can be diffraction limited. Developing an FPA with such devices as detectors is advantageous because each lamp costs around 30-50 cents and is a room-temperature detector sufficiently fast for video frame rates. Recently, a new 8 × 8 GDD FPA VLSI control board was designed, constructed, and experimentally tested. The first THz images obtained with this GDD FPA are presented in this paper. By appropriately moving the 8 × 8 pixel board around in the image plane, 32 × 32 pixel images are also obtained and shown here, with much improved image quality because of greatly reduced pixelization distortion.
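
A small sketch of how the 32 × 32 pixel images could be assembled by stepping the 8 × 8 detector board across the image plane; the simple abutting 4 × 4 scan geometry assumed here is an illustration, not the published scanning scheme.

import numpy as np

def assemble_scanned_image(tiles, grid_shape=(4, 4), tile_shape=(8, 8)):
    """Assemble 8x8 FPA readouts captured at a grid of positions into one image.

    `tiles` is indexed by (grid_row, grid_col); a 4x4 grid of 8x8 readouts
    yields a 32x32 image under the abutting-tiles assumption used here.
    """
    rows, cols = grid_shape
    th, tw = tile_shape
    image = np.zeros((rows * th, cols * tw))
    for r in range(rows):
        for c in range(cols):
            image[r * th:(r + 1) * th, c * tw:(c + 1) * tw] = tiles[(r, c)]
    return image

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    tiles = {(r, c): rng.random((8, 8)) for r in range(4) for c in range(4)}
    print("assembled image shape:", assemble_scanned_image(tiles).shape)   # (32, 32)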

Collaboration


Dive into Yitzhak Yitzhaky's collaborations.

Top Co-Authors

Norman S. Kopeika

Ben-Gurion University of the Negev

N. S. Kopeika

Ben-Gurion University of the Negev

Assaf Levanon

Ben-Gurion University of the Negev

Oren Haik

Ben-Gurion University of the Negev

Adrian Stern

Ben-Gurion University of the Negev

Avihai Aharon

Ben-Gurion University of the Negev

Doron Aloni

Ben-Gurion University of the Negev

Eli Peli

Massachusetts Eye and Ear Infirmary