
Publication


Featured research published by Hossein Talebi.


IEEE Transactions on Image Processing | 2014

Global Image Denoising

Hossein Talebi; Peyman Milanfar

Most existing state-of-the-art image denoising algorithms are based on exploiting similarity between a relatively modest number of patches. These patch-based methods are strictly dependent on patch matching, and their performance is hamstrung by the ability to reliably find sufficiently similar patches. As the number of patches grows, a point of diminishing returns is reached where the performance improvement due to more patches is offset by the lower likelihood of finding sufficiently close matches. The net effect is that while patch-based methods, such as BM3D, are excellent overall, they are ultimately limited in how well they can do on (larger) images with increasing complexity. In this paper, we address these shortcomings by developing a paradigm for truly global filtering where each pixel is estimated from all pixels in the image. Our objectives in this paper are two-fold. First, we give a statistical analysis of our proposed global filter, based on a spectral decomposition of its corresponding operator, and we study the effect of truncation of this spectral decomposition. Second, we derive an approximation to the spectral (principal) components using the Nyström extension. Using these, we demonstrate that this global filter can be implemented efficiently by sampling a fairly small percentage of the pixels in the image. Experiments illustrate that our strategy can effectively globalize any existing denoising filters to estimate each pixel using all pixels in the image, hence improving upon the best patch-based methods.
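The sampling idea can be illustrated with a minimal 1-D sketch: a photometric Gaussian kernel stands in for the image affinities, a small random pixel subset is eigendecomposed, and the Nyström extension lifts those eigenvectors to every pixel so the filter is applied in low rank. The function name, kernel choice, and all parameters below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nystrom_global_filter(signal, n_samples=40, h=0.15, rank=8, seed=0):
    """Toy 1-D sketch of a global filter approximated via the Nystrom
    extension; the kernel and parameters are illustrative assumptions."""
    x = signal.astype(float)
    n = x.size
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=n_samples, replace=False)  # small pixel sample

    # Photometric Gaussian affinities between all pixels and the sample
    K_nm = np.exp(-((x[:, None] - x[idx][None, :]) ** 2) / h**2)
    K_mm = K_nm[idx, :]

    # Eigendecomposition of the small sampled block only (m x m, cheap)
    vals, vecs = np.linalg.eigh(K_mm)
    top = np.argsort(vals)[::-1][:rank]
    vals, vecs = vals[top], vecs[:, top]

    # Nystrom extension: approximate leading eigenvectors over all n pixels
    phi = K_nm @ (vecs / vals)                     # (n, rank)

    # Row-normalized low-rank filtering without forming any n x n matrix
    num = phi @ (vals * (phi.T @ x))               # approximates K x
    den = phi @ (vals * (phi.T @ np.ones(n)))      # approximates K 1 (row sums)
    return num / np.maximum(den, 1e-8)
```

The cost is dominated by the m×m eigendecomposition plus n×m products, instead of anything quadratic in the pixel count.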


IEEE Transactions on Image Processing | 2013

How to SAIF-ly Boost Denoising Performance

Hossein Talebi; Xiang Zhu; Peyman Milanfar

Spatial domain image filters (e.g., bilateral filter, non-local means, locally adaptive regression kernel) have achieved great success in denoising. Their overall performance, however, has not generally surpassed the leading transform domain-based filters (such as BM3D). One important reason is that spatial domain filters lack an efficient way to adaptively fine-tune their denoising strength; something that is relatively easy to do in transform domain methods with shrinkage operators. In the pixel domain, the smoothing strength is usually controlled globally by, for example, tuning a regularization parameter. In this paper, we propose spatially adaptive iterative filtering (SAIF), a new strategy to control the denoising strength locally for any spatial domain method. This approach is capable of filtering local image content iteratively using the given base filter, and the type of iteration and the iteration number are automatically optimized with respect to estimated risk (i.e., mean-squared error). In exploiting the estimated local signal-to-noise ratio, we also present a new risk estimator that is different from the often-employed SURE method, and exceeds its performance in many cases. Experiments illustrate that our strategy can significantly relax the base algorithm's sensitivity to its tuning (smoothing) parameters, and effectively boost the performance of several existing denoising filters to generate state-of-the-art results under both simulated and practical conditions.
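The stopping logic can be sketched as follows. A plain box filter stands in for the base filter, and — purely for clarity — the true clean signal serves as an oracle risk; SAIF itself *estimates* the per-patch MSE and needs no clean reference, so the oracle here is an assumption of the sketch, not the method.

```python
import numpy as np

def box_filter(x, radius=2):
    """A plain box filter standing in for any spatial-domain base filter."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, k, mode="same")

def saif_style_stop(noisy, clean, max_iters=12):
    """Choose the diffusion iteration count that minimizes the MSE.
    Oracle version: uses the clean signal in place of SAIF's risk estimate."""
    best, best_mse = noisy, float(np.mean((noisy - clean) ** 2))
    y = noisy
    for _ in range(max_iters):
        y = box_filter(y)                      # diffusion: refilter the estimate
        mse = float(np.mean((y - clean) ** 2))
        if mse < best_mse:                     # keep the lowest-risk iterate
            best, best_mse = y, mse
    return best, best_mse
```

The paper additionally chooses *between* iteration types (diffusion vs. boosting) per patch; this sketch shows only the diffusion branch.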


IEEE Transactions on Image Processing | 2014

Nonlocal image editing

Hossein Talebi; Peyman Milanfar

In this paper, we introduce a new image editing tool based on the spectrum of a global filter computed from image affinities. Recently, it has been shown that the global filter derived from a fully connected graph representing the image can be approximated using the Nyström extension. This filter is computed by approximating the leading eigenvectors of the filter. These orthonormal eigenfunctions are highly expressive of the coarse and fine details in the underlying image, where each eigenvector can be interpreted as one scale of a data-dependent multiscale image decomposition. In this filtering scheme, each eigenvalue can boost or suppress the corresponding signal component in each scale. Our analysis shows that the mapping of the eigenvalues by an appropriate polynomial function endows the filter with a number of important capabilities, such as edge-aware sharpening, denoising, tone manipulation, and abstraction, to name a few. Furthermore, the edits can be easily propagated across the image.
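The core mechanism — remapping the eigenvalues of a symmetric filter matrix through a polynomial — fits in a few lines. The toy affinity matrix and the example polynomials below are assumptions for illustration: f(λ)=λ reproduces the filter itself, f(λ)=1 yields the identity, and f(λ)=2λ−λ² gives the classic "twicing" boost I−(I−W)².

```python
import numpy as np

def edit_filter(W, poly):
    """Remap the eigenvalues of a symmetric filter matrix W through poly."""
    vals, vecs = np.linalg.eigh(W)
    return (vecs * poly(vals)) @ vecs.T

# Assumed toy filter: a symmetrically normalized Gaussian affinity matrix
# (positive semidefinite, so its eigenvalues lie in [0, 1])
t = np.linspace(0, 1, 64)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.01)
d = K.sum(axis=1)
W = K / np.sqrt(d[:, None] * d[None, :])

smoothed = edit_filter(W, lambda lam: lam)                 # the filter itself
boosted  = edit_filter(W, lambda lam: 2 * lam - lam**2)    # sharpening boost
identity = edit_filter(W, lambda lam: np.ones_like(lam))   # f = 1 gives I
```

Because each eigenvector behaves like one scale of a data-dependent decomposition, choosing the polynomial per application (sharpening, tone manipulation, abstraction) amounts to choosing how much each scale is boosted or suppressed.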


IEEE Transactions on Computational Imaging | 2016

Fast Multilayer Laplacian Enhancement

Hossein Talebi; Peyman Milanfar

A novel, fast, and practical way of enhancing images is introduced in this paper. Our approach builds on Laplacian operators of well-known edge-aware kernels, such as bilateral and nonlocal means, and extends these filters' capabilities to perform more effective and fast image smoothing, sharpening, and tone manipulation. We propose an approximation of the Laplacian, which does not require normalization of the kernel weights. Multiple Laplacians of the affinity weights endow our method with progressive detail decomposition of the input image from fine to coarse scale. These image components are blended by a structure mask, which avoids noise/artifact magnification or detail loss in the output image. Contributions of the proposed method to existing image editing tools are: 1) low computational and memory requirements, making it appropriate for mobile device implementations (e.g., as a finish step in a camera pipeline); and 2) a range of filtering applications from detail enhancement to denoising with only a few control parameters, enabling the user to apply a combination of various (and even opposite) filtering effects.
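The fine-to-coarse decomposition and recombination can be sketched with a plain box filter standing in for the paper's edge-aware affinity kernels; the layer count and gains are illustrative assumptions, and the structure mask is omitted. Note the telescoping property: with all gains equal to 1, the layers sum back to the input exactly.

```python
import numpy as np

def smooth(x, radius=2):
    """Box smoother standing in for the paper's edge-aware kernels."""
    k = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(x, k, mode="same")

def multilayer_enhance(x, gains=(1.8, 1.4, 1.1)):
    """Fine-to-coarse detail decomposition with gain-weighted recombination."""
    layers, cur = [], x.astype(float)
    for _ in gains:
        nxt = smooth(cur)
        layers.append(cur - nxt)        # detail removed at this scale
        cur = nxt                       # progressively coarser base
    out = cur                           # coarse base layer
    for g, d in zip(gains, layers):
        out = out + g * d               # boost (or suppress) each scale
    return out
```

Gains above 1 sharpen a scale, gains below 1 suppress it, which is how one set of controls covers both enhancement and denoising-like effects.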


International Conference on Image Processing | 2012

Improving denoising filters by optimal diffusion

Hossein Talebi; Peyman Milanfar

Kernel-based methods have recently been widely used in image denoising. Tuning the parameters of these algorithms directly affects their performance. In this paper, an iterative method is proposed which optimizes the performance of any kernel-based denoising algorithm in the mean-squared error (MSE) sense, even with arbitrary parameters. In this work we estimate the MSE in each image patch, and use this estimate to decide when to stop the iteration, leading to improved performance. We propose a new estimator for the risk (i.e., MSE) which is different from the often-employed SURE method. We illustrate that the proposed risk estimate can outperform SURE in many instances.


International Conference on Image Processing | 2014

Global denoising is asymptotically optimal

Hossein Talebi; Peyman Milanfar

In this paper an upper bound on the decay rate of the mean-squared error for global image denoising is derived. As image size increases, this upper bound decays to zero; that is, global denoising is asymptotically optimal. This property holds only for global denoising schemes, not for patch-based methods such as BM3D. In practice, and as demonstrated in this work, the performance gap between patch-based and global denoisers can grow rapidly with image size.


Canadian Conference on Electrical and Computer Engineering | 2009

Contourlet based image compression using controlled modification of coefficients

Nader Karimi; Shadrokh Samavi; Shahram Shirani; Hossein Talebi; S.M.A. Zaynolabedin

A new compression algorithm is proposed in this paper which uses the contourlet transform. Unlike contourlet-based Non-Linear Approximation (NLA) compression algorithms, the proposed algorithm modifies the coefficients in a controlled manner. The modification is performed so that the difference between a modified coefficient and its original value is within a certain range. To achieve higher compression, the modifications are performed with the goal of minimizing the entropy of the coefficients. The implementation results show that our algorithm produces images with higher PSNRs, for similar bit-rate conditions, as compared to NLA compression algorithms. Furthermore, the visual quality of the images produced by our algorithm is higher than that of the mentioned NLA algorithms. The implementation results also show the superiority of our algorithm over the WBCT algorithm, which is based on the joint application of the wavelet and contourlet transforms.
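A heavily simplified sketch of the controlled-modification idea: generic coefficients stand in for contourlet coefficients (the transform itself is omitted), and the paper's entropy-minimizing search is replaced by simple grid snapping, which keeps each change within a tolerance while merging nearby values onto shared grid points — and merging values can never raise the empirical entropy.

```python
import numpy as np

def controlled_modify(coeffs, tol=0.05):
    """Snap each coefficient to a coarse grid so that its change from the
    original value never exceeds tol (grid step = 2 * tol)."""
    step = 2 * tol
    return np.round(coeffs / step) * step

def empirical_entropy(values):
    """Shannon entropy (bits) of the empirical value distribution."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())
```

The actual algorithm chooses modifications adaptively to minimize entropy rather than rounding to a fixed grid; this only shows why bounded modification and lower entropy can coexist.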


SIAM Journal on Imaging Sciences | 2016

Asymptotic Performance of Global Denoising

Hossein Talebi; Peyman Milanfar

We provide an upper bound on the rate of convergence of the mean-squared error for global image denoising and illustrate that this upper bound decays with increasing image size. Hence, global denoising is asymptotically optimal. At least in an oracle scenario this property does not hold for patch-based methods such as BM3D, thereby limiting their performance for large images. As observed in practice and shown in this work, this gap in performance is small for moderate size images, but it can grow quickly with image size.


IEEE Global Conference on Signal and Information Processing | 2013

Global image editing using the spectrum of affinity matrices

Hossein Talebi; Peyman Milanfar

In this work we introduce a new image editing tool, based on the spectrum of a global filter computed from image affinities. Recently, we have shown that the global filter derived from a fully connected graph representing the image can be approximated using the Nyström extension [1]. This filter is computed by approximating the leading eigenvectors of the filter. These orthonormal eigenfunctions are highly expressive of the coarse and fine details in the underlying image, where each eigenvector can be interpreted as one scale of a data-dependent multiscale image decomposition. In this filtering scheme, each eigenvalue can boost or suppress the corresponding signal component in each scale. Our analysis shows that the mapping of the eigenvalues by an appropriate polynomial function endows the filter with a number of important capabilities, such as edge-aware sharpening, denoising and tone manipulation.


International Conference on Image Processing | 2016

A new class of image filters without normalization

Peyman Milanfar; Hossein Talebi

When applying a filter to an image, it often makes practical sense to maintain the local brightness level from input to output image. This is achieved by normalizing the filter coefficients so that they sum to one. This concept is generally taken for granted, but is particularly important where nonlinear filters such as the bilateral or non-local means are concerned, where the effect on local brightness and contrast can be complex. Here we present a method for achieving the same level of control over the local filter behavior without the need for this normalization. Namely, we show how to closely approximate any normalized filter without in fact needing this normalization step. This yields a new class of filters. We derive a closed-form expression for the approximating filter and analyze its behavior, showing it to be easily controlled for quality and nearness to the exact filter with a single parameter. Our experiments demonstrate that the unnormalized affinity weights can be effectively used in applications such as image smoothing, sharpening and detail enhancement.
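One simple member of such a class can be sketched as I − αL, where L = D − K is the unnormalized graph Laplacian of the affinities; since L annihilates constants, the filter preserves local brightness by construction, with no row normalization. This particular form and the step size α are assumptions of the sketch — the paper derives its own closed-form approximation.

```python
import numpy as np

# Assumed toy affinities on a 1-D grid (Gaussian kernel, symmetric)
t = np.linspace(0, 1, 80)
K = np.exp(-((t[:, None] - t[None, :]) ** 2) / 0.01)
D = np.diag(K.sum(axis=1))
L = D - K                                  # unnormalized Laplacian: L @ 1 = 0
alpha = 1.0 / K.sum(axis=1).max()          # small step keeps the filter tame
W_hat = np.eye(len(t)) - alpha * L         # filter with no normalization step
```

Unlike the exact row-normalized filter D⁻¹K, this W_hat stays symmetric, which is exactly what makes spectral analysis and repeated application convenient.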

Collaboration


Dive into Hossein Talebi's collaborations.

Xiang Zhu

University of California
