Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Jean-Marie Becker is active.

Publications


Featured research published by Jean-Marie Becker.


Image and Vision Computing | 2008

Plane-based camera self-calibration by metric rectification of images

Jean-François Menudet; Jean-Marie Becker; Thierry Fournel; Catherine Mennessier

Plane-based self-calibration aims at the computation of camera intrinsic parameters from homographies relating multiple views of the same unknown planar scene. This paper proposes a straightforward geometric statement of plane-based self-calibration, through the concept of metric rectification of images. A set of constraints is derived from a decomposition of metric rectification in terms of intrinsic parameters and planar scene orientation. These constraints are then solved using an optimization framework based on the minimization of a geometrically motivated cost function. The link with previous approaches is demonstrated and our method appears to be theoretically equivalent but conceptually simpler. Moreover, a solution dealing with radial distortion is introduced. Experimentally, the method is compared with plane-based calibration and very satisfactory results are obtained. Markerless self-calibration is demonstrated using an intensity-based estimation of the inter-image homographies.
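As an illustration of the geometry involved (a sketch, not the paper's estimation algorithm): for a world plane Z = 0 viewed by a camera with intrinsics K, rotation R and translation t, the plane-to-image homography is H = K [r1 r2 t], and metric rectification amounts to applying H^-1 to image points; self-calibration recovers such a rectification from inter-image homographies without knowing K or the plane pose. All numbers and names below are illustrative.

```python
import numpy as np

def plane_to_image_homography(K, R, t):
    """Homography mapping metric plane coordinates (X, Y, 1) on the
    world plane Z = 0 to image pixels: H = K [r1 r2 t]."""
    return K @ np.column_stack([R[:, 0], R[:, 1], t])

def rectify(H, pts):
    """Metric rectification: apply H^-1 to homogeneous image points
    and normalize, recovering metric plane coordinates."""
    q = np.linalg.inv(H) @ pts
    return q[:2] / q[2]

# illustrative camera: tilt about the x-axis, generic translation
c, s = np.cos(0.3), np.sin(0.3)
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])
t = np.array([0.1, -0.2, 3.0])
H = plane_to_image_homography(K, R, t)

# a unit square on the plane (homogeneous coordinates), projected then rectified
square = np.array([[0.0, 1, 1, 0], [0, 0, 1, 1], [1, 1, 1, 1]])
recovered = rectify(H, H @ square)
```

Rectification restores the square exactly, so angles and length ratios on the plane are recovered even though the image of the square is a generic quadrilateral.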


Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2007

Compositional reflectance and transmittance model for multilayer specimens

Mathieu Hébert; Roger D. Hersch; Jean-Marie Becker

We propose a compositional model for predicting the reflectance and the transmittance of multilayer specimens composed of layers having possibly distinct refractive indices. The model relies on the laws of geometrical optics and on a description of the multiple reflection-transmission of light between the different layers and interfaces. The highly complex multiple reflection-transmission process occurring between several superposed layers is described by Markov chains. An optical element such as a layer or an interface forms a biface. The multiple reflection-transmission process is developed for a superposition of two bifaces. We obtain general composition formulas for the reflectance and the transmittance of a pair of layers and/or interfaces. Thanks to these compositional expressions, we can calculate the reflectance and the transmittance of three or more superposed bifaces. The model is applicable to regular compositions of bifaces, i.e., multifaces having on each face an angular light distribution that remains constant along successive reflection and transmission events. Kubelka's layering model, Saunderson's correction of the Kubelka-Munk model, and the Williams-Clapper model of a color layer superposed on a diffusing substrate are special cases of the proposed compositional model.
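The multiple reflection-transmission process between two superposed bifaces sums to geometric series, yielding Stokes-type composition formulas. A minimal sketch under that assumption (the paper derives its formulas via Markov chains; parameter names here are illustrative): each biface is described by its front and back reflectances and its downward and upward transmittances.

```python
def compose(b1, b2):
    """Compose two bifaces stacked with b1 on top of b2.
    Each biface is (r, t, r_back, t_back): front reflectance, downward
    transmittance, back reflectance, upward transmittance."""
    r1, t1, r1b, t1b = b1
    r2, t2, r2b, t2b = b2
    denom = 1.0 - r1b * r2          # geometric series over inter-reflections
    r = r1 + t1 * r2 * t1b / denom  # front reflectance of the pair
    t = t1 * t2 / denom             # downward transmittance of the pair
    rb = r2b + t2b * r1b * t2 / denom
    tb = t2b * t1b / denom
    return (r, t, rb, tb)

# two illustrative passive bifaces (r + t < 1, the rest is absorbed)
top = (0.10, 0.80, 0.10, 0.80)
bottom = (0.20, 0.70, 0.15, 0.70)
pair = compose(top, bottom)
```

Because this composition corresponds to a 2x2 transfer-matrix product, it is associative, so stacks of three or more bifaces can be built by repeated pairwise composition in any grouping.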


Cement & Concrete Composites | 2001

Surface state analysis by means of confocal microscopy

Jean-Marie Becker; Stéphane Grousson; Michel Jourlin

This article illustrates the potential offered by new imaging technologies, in the present case confocal microscopy. Such an approach gives access to surface information such as porosity, wear, cracking and roughness, and is thus a very powerful tool for the evaluation of the surface state of various materials: concrete, ceramics, aluminum alloys, textiles. Perspectives are given on how best to exploit and visualize this new data.


International Journal of Computer Vision | 2015

Fast Approximations of Shift-Variant Blur

Loïc Denis; Éric Thiébaut; Ferréol Soulez; Jean-Marie Becker; Rahul Mourya

Image deblurring is essential in high resolution imaging, e.g., astronomy, microscopy or computational photography. Shift-invariant blur is fully characterized by a single point-spread-function (PSF). Blurring is then modeled by a convolution, leading to efficient algorithms for blur simulation and removal that rely on fast Fourier transforms. However, in many different contexts, blur cannot be considered constant throughout the field-of-view, which necessitates modeling variations of the PSF with the location. These models must achieve a trade-off between the accuracy their flexibility allows and their computational efficiency. Several fast approximations of blur have been proposed in the literature. We give a unified presentation of these methods in the light of matrix decompositions of the blurring operator. We establish the connection between different computational tricks that can be found in the literature and the physical sense of corresponding approximations in terms of equivalent PSFs, physically-based approximations being preferable. We derive an improved approximation that preserves the same desirable low complexity as other fast algorithms while reaching a minimal approximation error. Comparison of theoretical properties and empirical performances of each blur approximation suggests that the proposed general model is preferable for approximation and inversion of a known shift-variant blur.
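One of the fast approximations that such a unified view covers expresses shift-variant blur as a weighted sum of ordinary convolutions with a few local PSFs (a masked-convolution, or PSF-interpolation, scheme). A minimal numpy sketch, with circular boundary handling for brevity; all names are illustrative and this is not the paper's improved approximation:

```python
import numpy as np

def fftconv2(a, k):
    """'Same'-size 2-D convolution via FFT (circular boundaries for brevity)."""
    K = np.zeros_like(a, dtype=float)
    kh, kw = k.shape
    K[:kh, :kw] = k
    # center the kernel at the origin so the output is not shifted
    K = np.roll(K, (-(kh // 2), -(kw // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(K)))

def blur_shift_variant(img, psfs, weights):
    """Approximate shift-variant blur as a sum of masked convolutions:
    y = sum_k conv(w_k * img, psf_k), one FFT convolution per local PSF."""
    out = np.zeros_like(img, dtype=float)
    for psf, w in zip(psfs, weights):
        out += fftconv2(img * w, psf)
    return out

# sanity check: with identical delta PSFs and weights summing to one,
# the operator reduces to the identity (shift-invariant, no blur)
img = np.arange(64.0).reshape(8, 8)
delta = np.zeros((3, 3))
delta[1, 1] = 1.0
left = np.zeros((8, 8))
left[:, :4] = 1.0
out = blur_shift_variant(img, [delta, delta], [left, 1.0 - left])
```

With distinct PSFs per region, the cost stays at a few FFT convolutions regardless of image size, which is the low complexity the fast approximations share.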


Journal of Optics | 2008

Correspondence between continuous and discrete two-flux models for reflectance and transmittance of diffusing layers

Mathieu Hébert; Jean-Marie Becker

This paper provides a theoretical connection between two different mathematical models dedicated to the reflectance and the transmittance of diffusing layers. The Kubelka–Munk model proposes a continuous description of scattering and absorption for two opposite diffuse fluxes in a homogeneous layer (continuous two-flux model). On the other hand, Kubelka's layering model describes the multiple reflections and transmissions of light taking place between various superposed diffusing layers (discrete two-flux model). The compatibility of these two models is shown. In particular, the Kubelka–Munk reflectance and transmittance expressions are retrieved, using Kubelka's layering model, with mathematical arguments using infinitely thin sublayers. A new approach to the Kubelka–Munk expressions is thus obtained, giving, moreover, new details for physical interpretation of the Kubelka–Munk theory.
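The correspondence can be checked numerically: composing many thin sublayers (r ~ S dx, t ~ 1 - (K + S) dx, with K and S the absorption and scattering coefficients) using Kubelka's discrete composition formulas converges to the closed-form Kubelka–Munk expressions. A sketch under these assumptions, for a layer over a non-reflecting background:

```python
import math

def km_reflectance_transmittance(K, S, h):
    """Closed-form Kubelka-Munk reflectance and transmittance of a
    homogeneous layer of thickness h over a non-reflecting background."""
    a = 1.0 + K / S
    b = math.sqrt(a * a - 1.0)
    sh, ch = math.sinh(b * S * h), math.cosh(b * S * h)
    return sh / (a * sh + b * ch), b / (a * sh + b * ch)

def layered_reflectance_transmittance(K, S, h, n=20000):
    """Discrete two-flux model: stack n thin sublayers, each with
    r = S dx and t = 1 - (K + S) dx, via Kubelka's composition formulas."""
    dx = h / n
    r, t = S * dx, 1.0 - (K + S) * dx
    R, T = 0.0, 1.0  # empty stack: no reflection, full transmission
    for _ in range(n):
        denom = 1.0 - r * R
        R = r + t * t * R / denom  # add one sublayer on top of the stack
        T = t * T / denom
    return R, T

# illustrative coefficients: the two models agree as dx -> 0
R_cf, T_cf = km_reflectance_transmittance(0.5, 1.0, 1.0)
R_d, T_d = layered_reflectance_transmittance(0.5, 1.0, 1.0)
```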


IEEE Nuclear Science Symposium | 2011

A new representation and projection model for tomography, based on separable B-splines

Fabien Momey; Loïc Denis; Catherine Mennessier; Éric Thiébaut; Jean-Marie Becker; Laurent Desbat

Data modeling in tomography is a key point for iterative reconstruction. The design of the projector, i.e. the numerical model of projection, is mostly influenced by the representation of the object of interest, decomposed over a discrete basis of functions.


Measurement Science and Technology | 2009

Specific design requirements for a reliable slope and curvature measurement standard

Pauline Rose; Yves Surrel; Jean-Marie Becker

In the domain of surface metrology, direct altitude measurements are increasingly being challenged by slope and curvature measuring methods because of the numerous advantages of the latter measurands. Reference standards are needed to assess the quality of these slope and curvature measuring systems and thus to allow their adoption by industry. Up to now, no specific slope or curvature measurement standard has been defined; rather, existing standards are designed in terms of altitude profile specifications. This paper details our experience with a reference manufactured piece intended for deflectometric slope measurement validation. An important discrepancy between the piece specifications and the measurements led us to cross-check our deflectometric measurements with differential interferometry. The results obtained using the two measurement methods matched very well. A plausible explanation of the discrepancy between the piece specifications and the measurement results is that small altitude variations may have considerable effects on slopes and curvatures. This real example raises the question of the specific design features for slope and curvature measurement standards and highlights the importance of the chosen altitude profile.
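The amplification effect is easy to quantify: a sinusoidal altitude ripple z(x) = a sin(2*pi*x/lambda) has slope amplitude 2*pi*a/lambda and curvature amplitude (2*pi/lambda)^2 * a, so derivation amplifies short-wavelength errors. A sketch with illustrative numbers:

```python
import math

def ripple_slope_curvature(amplitude, wavelength):
    """Peak slope and peak curvature of z(x) = amplitude * sin(2*pi*x/wavelength):
    differentiation multiplies the amplitude by k = 2*pi/wavelength per order."""
    k = 2.0 * math.pi / wavelength
    return amplitude * k, amplitude * k * k

# a 10 nm ripple with a 1 mm period: negligible in altitude, yet it produces
# ~6.3e-5 rad of peak slope and ~0.39 m^-1 of peak curvature (radius ~2.5 m)
slope, curv = ripple_slope_curvature(10e-9, 1e-3)
```

This is why a standard specified only by an altitude profile tolerance can still be far off in slope and curvature: the shorter the spatial wavelength of the residual error, the stronger the amplification.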


International Conference on Image Processing | 2015

Augmented Lagrangian without alternating directions: Practical algorithms for inverse problems in imaging

Rahul Mourya; Loïc Denis; Jean-Marie Becker; Éric Thiébaut

Several problems in signal processing and machine learning can be cast as optimization problems. In many cases, they are large-scale, nonlinear, constrained, and may be nonsmooth in the unknown parameters. There exists a plethora of fast algorithms for smooth convex optimization, but these algorithms are not readily applicable to nonsmooth problems, which has led to a considerable amount of research in this direction. In this paper, we propose a general algorithm for nonsmooth bound-constrained convex optimization problems. Our algorithm is an instance of the so-called augmented Lagrangian method, for which theoretical convergence is well established for convex problems. The proposed algorithm is a blend of a superlinearly convergent limited-memory quasi-Newton method and a proximal projection operator. Initial promising numerical results for total-variation-based image deblurring show that it is as fast as the best existing algorithms in the same class, but with fewer and less sensitive tuning parameters, which makes a huge difference in practice.
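The classic augmented-Lagrangian loop underneath such methods can be sketched on a toy equality-constrained problem; here plain gradient descent stands in for the quasi-Newton inner solver, and all names and parameter values are illustrative, not the paper's algorithm:

```python
import numpy as np

def augmented_lagrangian(grad_f, constraint, grad_c, x0,
                         rho=10.0, outer=30, inner=200, lr=0.01):
    """Augmented-Lagrangian loop for min f(x) s.t. c(x) = 0:
    inner smooth minimization of f(x) + lam*c(x) + (rho/2)*c(x)^2
    (here plain gradient descent), then a first-order multiplier update."""
    x, lam = x0.astype(float), 0.0
    for _ in range(outer):
        for _ in range(inner):
            cx = constraint(x)
            g = grad_f(x) + (lam + rho * cx) * grad_c(x)
            x -= lr * g
        lam += rho * constraint(x)  # multiplier update drives c(x) -> 0
    return x

# illustrative problem: min ||x||^2  s.t.  sum(x) = 1  (solution: x_i = 1/n)
n = 4
x = augmented_lagrangian(
    grad_f=lambda x: 2.0 * x,
    constraint=lambda x: x.sum() - 1.0,
    grad_c=lambda x: np.ones_like(x),
    x0=np.zeros(n),
)
```

Unlike ADMM-style schemes, there is no alternating split here: the inner solver minimizes over all variables jointly, and the only tuning parameters are the penalty rho and the inner-solver settings.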


IEEE Transactions on Image Processing | 2015

Spline Driven: High Accuracy Projectors for Tomographic Reconstruction From Few Projections

Fabien Momey; Loïc Denis; Catherine Burnier; Éric Thiébaut; Jean-Marie Becker; Laurent Desbat

Tomographic iterative reconstruction methods need a very thorough modeling of data. This point becomes critical when the number of available projections is limited. At the core of this issue is the projector design, i.e., the numerical model relating the representation of the object of interest to the projections on the detector. Voxel-driven and ray-driven projection models are widely used for their short execution time in spite of their coarse approximations. The distance-driven model has improved accuracy but makes strong approximations when projecting voxel basis functions. Cubic voxel basis functions are anisotropic; accurately modeling their projection is, therefore, computationally expensive. Both smoother and more isotropic basis functions better represent the continuous functions and provide simpler projectors. These considerations have led to the development of spherically symmetric volume elements, called blobs. Isotropy aside, blobs are often considered too computationally expensive in practice. In this paper, we consider using separable B-splines as basis functions to represent the object, and we propose to approximate the projection of these basis functions by a 2D separable model. When the degree of the B-splines increases, their isotropy improves and projections can be computed regardless of their orientation. The degree and the sampling of the B-splines can be chosen according to a tradeoff between approximation quality and computational complexity. We quantitatively measure the accuracy of our model and compare it with other projectors, such as the distance-driven model and the model proposed by Long et al. From the numerical experiments, we demonstrate that our projector, with its improved accuracy, better preserves the quality of the reconstruction as the number of projections decreases. Our projector with cubic B-splines requires about twice as many operations as a model based on voxel basis functions. Higher accuracy projectors can be used to improve the resolution of existing systems, or to reduce the number of projections required to reach a given resolution, potentially reducing the dose absorbed by the patient.
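The separable cubic B-spline basis underlying such a projector is the standard centered beta^3 spline: the 2-D basis functions are products of 1-D splines and form a partition of unity. A minimal sketch of the basis evaluation only (not the projection model):

```python
import numpy as np

def cubic_bspline(x):
    """Centered cubic B-spline beta^3: piecewise cubic, support [-2, 2],
    unit integral, and partition of unity over integer shifts."""
    ax = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(ax)
    m1 = ax < 1
    m2 = (ax >= 1) & (ax < 2)
    out[m1] = 2.0 / 3.0 - ax[m1] ** 2 + ax[m1] ** 3 / 2.0
    out[m2] = (2.0 - ax[m2]) ** 3 / 6.0
    return out

def separable_basis(x, y):
    """2-D basis function as a separable product beta^3(x) * beta^3(y),
    evaluated on the grid formed by the 1-D coordinate arrays x and y."""
    return cubic_bspline(x)[:, None] * cubic_bspline(y)[None, :]
```

Separability is what keeps the projector cheap: the 2-D (and 3-D) basis is never tabulated as such; only 1-D spline evaluations are needed along each axis.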


International Symposium on Signals, Circuits and Systems | 2007

Texture Classification by ICA

Daniela Coltuc; Thierry Fournel; Jean-Marie Becker

ICA (Independent Component Analysis) is a mathematical tool traditionally employed for source separation. In this paper, we test its ability for texture analysis, in order to provide a new texture classification method. From the multitude of the existing algorithms, we have chosen FastICA, a version based on the fourth-order statistics of the analyzed signal. By FastICA, a texture is decomposed into a weighted sum of components with a rather high degree of independence. Each component is further described by means of its negentropy, a measure of non-Gaussianity. We show experimentally that the three most non-Gaussian components of each analyzed texture are able to cluster the test samples.
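Negentropy is commonly approximated by the one-unit contrast J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log cosh(u) and nu standard normal, which is also the quantity FastICA maximizes. A numpy sketch of this measure (the reference expectation is estimated by Monte Carlo; names are illustrative):

```python
import numpy as np

def negentropy_proxy(y, n_ref=200_000, seed=0):
    """One-unit negentropy approximation J(y) ~ (E[G(y)] - E[G(nu)])^2
    with G(u) = log cosh(u) and nu standard normal (Monte-Carlo reference).
    The input is standardized first, so the measure is scale-invariant
    and is zero (up to sampling noise) for Gaussian data."""
    rng = np.random.default_rng(seed)  # fixed seed: same reference each call
    y = (y - y.mean()) / y.std()
    g_y = np.mean(np.log(np.cosh(y)))
    g_nu = np.mean(np.log(np.cosh(rng.standard_normal(n_ref))))
    return (g_y - g_nu) ** 2

# a super-Gaussian (Laplacian) signal scores markedly higher than a Gaussian one
rng = np.random.default_rng(1)
gauss = rng.standard_normal(100_000)
lap = rng.laplace(size=100_000)
```

Ranking components by this proxy and keeping the most non-Gaussian ones mirrors the selection of the three most non-Gaussian components described above.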

Collaboration


Dive into Jean-Marie Becker's collaborations.

Top Co-Authors

Éric Thiébaut, École normale supérieure de Lyon
Loïc Denis, Centre national de la recherche scientifique
Daniela Coltuc, Politehnica University of Bucharest
Fabien Momey, École normale supérieure de Lyon
Laurent Desbat, Centre national de la recherche scientifique
Michel Jourlin, École supérieure de chimie physique électronique de Lyon