Publication


Featured research published by Ali Alsam.


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2007

Calibrating color cameras using metameric blacks

Ali Alsam; Reiner Lenz

Spectral calibration of digital cameras based on the spectral data of commercially available calibration charts is an ill-conditioned problem that has an infinite number of solutions. We introduce a method to estimate the sensor's spectral sensitivity function based on metamers. For a given patch on the calibration chart we construct numerical metamers by computing convex linear combinations of spectra from calibration chips with lower and higher sensor response values. The difference between the measured reflectance spectrum and the numerical metamer lies in the null space of the sensor. For each measured spectrum we use this procedure to compute a collection of color signals that lie in the null space of the sensor. For a collection of such spaces we compute the robust principal components, and we obtain an estimate of the sensor by computing the space orthogonal to the common null space spanned by these vectors. Our approach has a number of advantages over standard techniques: it is robust to outliers, it is not dominated by larger response values, and it offers the ability to evaluate the goodness of the solution; indeed, it is possible to show that the solution is optimal, given the data, if the calculated range is one-dimensional.
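The core construction lends itself to a short numerical sketch. The following is an illustration only, not the authors' implementation; the helper name and the assumption that measured camera responses are available for every chart patch are mine.

import numpy as np

def null_space_vector(target_spectrum, target_resp,
                      low_spectrum, low_resp,
                      high_spectrum, high_resp):
    # Given the measured camera response of a target patch and of two chart
    # patches with lower and higher responses, build a convex combination of
    # the two spectra that reproduces the target response.  The difference
    # between the measured target spectrum and this numerical metamer then
    # lies (approximately) in the null space of the unknown sensor.
    w = (target_resp - low_resp) / (high_resp - low_resp)   # convex weight
    metamer = w * high_spectrum + (1.0 - w) * low_spectrum
    return target_spectrum - metamer

Collecting such vectors over all patches and taking their robust principal components gives a basis for the null space; the sensor estimate is the direction orthogonal to it.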


Scandinavian Conference on Image Analysis | 2009

Colour Gamut Mapping as a Constrained Variational Problem

Ali Alsam; Ivar Farup

We present a novel, computationally efficient, iterative, spatial gamut mapping algorithm. The proposed algorithm offers a compromise between the colorimetrically optimal gamut clipping and the most successful spatial methods. This is achieved by the iterative nature of the method: at iteration level zero, the result is identical to gamut clipping; the more we iterate, the more we approach an optimal, spatial, gamut mapping result. Optimal is defined here as a gamut mapping algorithm that preserves the hue of the image colours as well as the spatial ratios at all scales. Our results show that as few as five iterations are sufficient to produce an output that is as good as or better than that achieved by previous, computationally more expensive, methods. Being able to improve upon previous results with such a low number of iterations allows us to state that the proposed algorithm is O(N), N being the number of pixels. Results based on a challenging small destination gamut support our claim that it is indeed efficient.
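A schematic of the general iteration pattern the abstract describes, an illustration only and not the published algorithm: clip_to_gamut is an assumed user-supplied function that performs colorimetric gamut clipping, and the Laplacian nudge stands in for the spatial term.

import numpy as np

def laplacian(img):
    # 4-neighbour Laplacian with replicated borders, applied per channel.
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    return (padded[:-2, 1:-1] + padded[2:, 1:-1] +
            padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * img)

def iterative_gamut_map(original, clip_to_gamut, n_iter=5, step=0.5):
    # Iteration zero is plain gamut clipping; each further iteration pulls
    # the result toward the spatial structure (Laplacian) of the original
    # and re-imposes the gamut, so the cost per iteration is O(N).
    result = clip_to_gamut(original)
    for _ in range(n_iter):
        result = result + step * (laplacian(original) - laplacian(result))
        result = clip_to_gamut(result)
    return result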


International Symposium on Visual Computing | 2012

Spatial Colour Gamut Mapping by Orthogonal Projection of Gradients onto Constant Hue Lines

Ali Alsam; Ivar Farup

We present a computationally efficient, artifact-free, spatial gamut mapping algorithm. The proposed algorithm offers a compromise between the colorimetrically optimal gamut clipping and an ideal spatial gamut mapping. This is achieved by the iterative nature of the method: at iteration level zero, the result is identical to gamut clipping; the more we iterate, the more we approach an optimal spatial gamut mapping result. Our results show that a low number of iterations, 20-30, is sufficient to produce an output that is as good as or better than that achieved by previous, computationally more expensive, methods. More importantly, we introduce a new method to calculate the gradients of a vector-valued image by means of a projection operator which guarantees that the hue of the gamut-mapped colour vector is identical to the original. Furthermore, the algorithm results in no visible halos in the gamut-mapped image, a problem which is common in previous spatial methods. Finally, the proposed algorithm is fast: computational complexity is O(N), N being the number of pixels. Results based on a challenging small destination gamut support our claim that it is indeed efficient.
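The hue-preserving projection can be sketched as follows. This is an illustration of the idea, not the published code, and it assumes a linear RGB-like space in which the grey axis is (1, 1, 1).

import numpy as np

def project_to_constant_hue(colour, delta, eps=1e-9):
    # Project an update `delta` onto the plane spanned by the grey axis and
    # the colour's own chromatic direction, so the hue angle cannot change.
    neutral = np.ones(3) / np.sqrt(3.0)
    chroma = colour - (colour @ neutral) * neutral
    norm = np.linalg.norm(chroma)
    if norm < eps:
        return delta            # achromatic colour: no hue to preserve
    chroma = chroma / norm
    return (delta @ neutral) * neutral + (delta @ chroma) * chroma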


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2007

Metamer Sets without Spectral Calibration

Ali Alsam; Graham D. Finlayson

The set of metamers for a given device response can be calculated given the device's spectral sensitivities. Knowledge of the metamer set has been useful in practical applications such as color correction and reflectance recovery. Unfortunately, the device sensitivities of a camera or scanner are often not known, and they are difficult to estimate reliably outside the laboratory. We show how metamer sets can be calculated when a device's spectral sensitivities are not known. The result is built on two observations: first, the set of all reflectance spectra consists of convex combinations of certain basic colors that tend to be very bright (or dark) and have high chroma; second, the convex combinations that describe reflectance spectra result in convex combinations of red-green-blue (RGB) values. Thus, given an RGB value, it is possible to find the set of convex combinations of the RGB values of the basic colors that generate the same RGB value. The corresponding set of convex combinations of the basic spectra is the metamer set.
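The search for convex combinations can be posed as a small linear program; the sketch below is illustrative, and the names and the use of scipy are assumptions rather than details from the paper. Applying the recovered weights to the basic reflectance spectra gives one member of the metamer set; sweeping different objective vectors c traces out its extreme points.

import numpy as np
from scipy.optimize import linprog

def metamer_weights(basic_rgbs, target_rgb, c=None):
    # basic_rgbs: (n, 3) RGBs of the basic colors; target_rgb: (3,).
    # Find convex weights w >= 0, sum(w) = 1, with basic_rgbs.T @ w == target_rgb.
    n = basic_rgbs.shape[0]
    if c is None:
        c = np.zeros(n)                     # any feasible point will do
    A_eq = np.vstack([basic_rgbs.T, np.ones((1, n))])
    b_eq = np.concatenate([target_rgb, [1.0]])
    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0.0, 1.0)] * n)
    return res.x if res.success else None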


Computational Color Imaging Workshop | 2011

Spatial colour gamut mapping by means of anisotropic diffusion

Ali Alsam; Ivar Farup

We present a computationally efficient, artifact-free, spatial colour gamut mapping algorithm. The proposed algorithm offers a compromise between the colorimetrically optimal gamut clipping and an ideal spatial gamut mapping. It exploits anisotropic diffusion to reduce the halos that often appear in spatially gamut-mapped images. It is implemented as an iterative method: at iteration level zero, the result is identical to gamut clipping; the more we iterate, the more we approach an optimal, spatial gamut mapping result. Our results show that a low number of iterations, 10-20, is sufficient to produce an output that is as good as or better than that achieved by previous, computationally more expensive, methods. The computational complexity for one iteration is O(N), N being the number of pixels. Results based on a challenging small destination gamut support our claim that it is indeed efficient.
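The edge-stopping mechanism can be illustrated with a generic anisotropic diffusion step, a sketch in the spirit of the abstract rather than the published gamut-mapping algorithm: the conduction coefficient falls off where colour gradients are strong, which is what limits haloing around edges.

import numpy as np

def anisotropic_step(img, kappa=0.1, step=0.2):
    # One O(N) diffusion iteration on a float image of shape (H, W, 3).
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    flux = np.zeros_like(img)
    for d in (padded[:-2, 1:-1] - img,   # north neighbour difference
              padded[2:, 1:-1] - img,    # south
              padded[1:-1, :-2] - img,   # west
              padded[1:-1, 2:] - img):   # east
        g = np.exp(-(np.linalg.norm(d, axis=2, keepdims=True) / kappa) ** 2)
        flux += g * d                    # edge-stopping conduction
    return img + step * flux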


International Conference on Pattern Recognition | 2010

Colour Constant Image Sharpening

Ali Alsam

In this paper, we introduce a new sharpening method which guarantees colour constancy and resolves the problem of equiluminant colours. The algorithm is similar to unsharp masking in that the gradients are calculated at different scales by blurring the original with a variable-size kernel. The main difference is in the blurring stage, where we calculate the average of an n x n neighborhood by projecting each colour vector onto the space of the center pixel before averaging. Thus, starting with the center pixel, we define a projection matrix onto the space of that vector. Each neighboring colour is then projected onto the center and the results are summed. The projection step yields an average vector which shares the direction of the original center pixel. The difference between the center pixel and the average is, by definition, a scalar multiple of the center pixel, so adding the average to the center pixel is guaranteed not to result in colour shifts. This projection step is also shown to remedy the problem of equiluminant colours and can be used for m-dimensional data. Finally, the results indicate that the new sharpening method results in better sharpening than that achieved using unsharp masking, with noticeably fewer halos around strong edges. The latter aspect of the algorithm is believed to be due to the asymmetric nature of the projection step.
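A minimal sketch of the projection-then-average step described above, an illustration rather than the published implementation:

import numpy as np

def projected_average(patch):
    # patch: (n, n, 3) neighbourhood.  Project every colour onto the
    # direction of the centre pixel before averaging, so the blurred value
    # shares the centre pixel's direction.
    c = patch[patch.shape[0] // 2, patch.shape[1] // 2]
    coeffs = patch.reshape(-1, 3) @ c / (float(c @ c) + 1e-12)
    return coeffs.mean() * c

def sharpen_pixel(patch, amount=1.0):
    # Unsharp-mask style boost; the correction is parallel to the centre
    # colour, so no hue shift can be introduced.
    c = patch[patch.shape[0] // 2, patch.shape[1] // 2]
    return c + amount * (c - projected_average(patch))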


Scandinavian Conference on Image Analysis | 2009

Contrast Enhancing Colour to Grey

Ali Alsam

A spatial algorithm to convert colour images to greyscale is presented. The method is very fast and results in increased local and global contrast. At each image pixel, three weights are calculated. These are defined as the difference between the blurred luminance image and the colour channels: red, green and blue. The higher the difference, the more weight is given to that channel in the conversion. The method is multi-resolution and allows the user to enhance contrast at different scales. Results based on three colour images show that the method yields higher contrast than the luminance conversion and two spatial methods: Socolinsky and Wolff [1,2] and Alsam and Drew [3].
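A sketch of the per-pixel weighting the abstract describes; details such as the luminance proxy and the normalisation are assumptions, not taken from the paper.

import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_enhancing_grey(img, sigma=2.0, eps=1e-6):
    # img: float RGB image of shape (H, W, 3) in [0, 1].
    lum = img.mean(axis=2)                        # simple luminance proxy
    blurred = gaussian_filter(lum, sigma)
    w = np.abs(blurred[..., None] - img)          # per-channel difference
    w = w / (w.sum(axis=2, keepdims=True) + eps)  # larger difference, larger weight
    return (w * img).sum(axis=2)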


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2014

Robust metric for the evaluation of visual saliency algorithms

Ali Alsam; Puneet Sharma

In this paper, we analyzed eye fixation data obtained from 15 observers and 1003 images. When studying the eigen-decomposition of the correlation matrix constructed from the fixation data of one observer viewing all images, we observed that 23% of the data can be accounted for by one eigenvector. This finding implies a repeated viewing pattern that is independent of image content. Examination of this pattern revealed that it is highly correlated with the center region of the image. The presence of a repeated viewing pattern raised the following question: can we use the statistical information contained in the first eigenvector to filter out the fixations that were part of the pattern from those that are image-feature dependent? To answer this question, we designed a robust AUC metric that uses statistical analysis to better judge the goodness of the different saliency algorithms.
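The eigen-analysis step can be reproduced schematically as follows; the data layout is an assumption (one row per image, holding that observer's flattened fixation map). The share of the leading eigenvalue corresponds to the 23% figure quoted above.

import numpy as np

def first_component_share(fixation_maps):
    # fixation_maps: (n_images, n_pixels) array for a single observer.
    corr = np.corrcoef(fixation_maps)     # image-by-image correlation matrix
    eigvals = np.linalg.eigvalsh(corr)    # ascending eigenvalues
    return eigvals[-1] / eigvals.sum()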


Scandinavian Conference on Image Analysis | 2013

Asymmetry as a Measure of Visual Saliency

Ali Alsam; Puneet Sharma; Anette Wrålsen



International Symposium on Visual Computing | 2013

What the Eye Did Not See – A Fusion Approach to Image Coding

Ali Alsam; Hans Jakob Rivertz; Puneet Sharma


Collaboration


Dive into Ali Alsam's collaborations.

Top Co-Authors

Ivar Farup, Gjøvik University College
Andrei Ouglov, Gjøvik University College
Marius Pedersen, Gjøvik University College
Peter Nussbaum, Norwegian University of Science and Technology
Rune Hjelsvold, Gjøvik University College