
Publication


Featured research published by Jorge Llacer.


IEEE Transactions on Medical Imaging | 1987

Stopping Rule for the MLE Algorithm Based on Statistical Hypothesis Testing

Eugene Veklerov; Jorge Llacer

It is known that when the maximum likelihood estimator (MLE) algorithm passes a certain point, it produces images that begin to deteriorate. We propose a quantitative criterion with a simple probabilistic interpretation that allows the user to stop the algorithm just before this effect begins. The MLE algorithm searches for the image that has the maximum probability of generating the projection data. The underlying assumption of the algorithm is a Poisson distribution of the data. Therefore, the best image, according to the MLE algorithm, is the one that results in projection means which are as close to the data as possible. It is shown that this goal conflicts with the assumption that the data are Poisson-distributed. We test a statistical hypothesis whereby the projection data could have been generated by the image produced after each iteration. The acceptance or rejection of the hypothesis is based on a parameter that decreases as the images improve and increases as they deteriorate. We show that the best MLE images, those which pass the test, exhibit somewhat lower noise in regions of high activity than the filtered back-projection results and much improved images in low-activity regions. The applicability of the proposed stopping rule to other iterative schemes is discussed.
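
As a minimal illustration of the stopping idea described above, the Python sketch below runs a standard MLEM loop and halts when the standardized projection residuals become smaller than Poisson statistics allow. The system matrix A, the data y, and the unit-variance check are illustrative stand-ins, not the exact statistic published in the paper.

```python
import numpy as np

def mlem_with_stopping(A, y, n_iter=200):
    """MLEM reconstruction with a Poisson-consistency stopping check.

    A : (n_bins, n_pixels) system matrix (dense here for clarity; illustrative)
    y : (n_bins,) measured projection counts, assumed Poisson distributed

    The check below is a stand-in for the paper's hypothesis test: under the
    Poisson model the standardized residuals (y - q)/sqrt(q) should have unit
    variance.  Over-iteration drives the projections closer to the data than
    Poisson statistics allow, so the variance drops below 1; we stop there.
    """
    x = np.ones(A.shape[1])                         # flat starting image
    sens = np.maximum(A.sum(axis=0), 1e-12)         # sensitivity (column sums)
    for it in range(n_iter):
        q = np.maximum(A @ x, 1e-12)                # forward projection
        x *= (A.T @ (y / q)) / sens                 # multiplicative MLEM update

        q = np.maximum(A @ x, 1e-12)
        resid_var = np.mean((y - q) ** 2 / q)       # ~1 if Poisson-consistent
        if resid_var < 1.0:
            # projections now fit the data better than Poisson noise permits:
            # further iterations would start amplifying noise in the image
            return x, it + 1
    return x, n_iter
```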


IEEE Transactions on Nuclear Science | 1985

Matrix-Based Image Reconstruction Methods for Tomography

Jorge Llacer; John Meng

Matrix methods of image reconstruction have not been used, in general, because of the large size of practical matrices, their ill-conditioning upon inversion, and the success of Fourier-based techniques. An exception is the work that has been done at the Lawrence Berkeley Laboratory for imaging with accelerated radioactive ions. An extension of that work into more general imaging problems shows that, with a correct formulation of the problem, positron tomography with ring geometries results in well-behaved matrices which can be used for image reconstruction with no distortion of the point response in the field of view and with flexibility in the design of the instrument. Maximum Likelihood Estimator methods of reconstruction, which use system matrices tailored to specific instruments and do not need matrix inversion, are shown to result in good preliminary images. A parallel processing computer structure based on multiple inexpensive microprocessors is proposed as a system to implement the matrix-MLE methods.
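
The sketch below builds a toy system matrix for a ring of detectors around a square pixel grid, purely to show how such a matrix can be assembled for a specific instrument geometry and then used only through matrix-vector products (A @ x and A.T @ r), with no inversion. The geometry, the point-sampling scheme, and the sizes are assumptions made for illustration, not the paper's formulation.

```python
import numpy as np

def toy_ring_system_matrix(n_det=32, n_pix=16, n_samples=200):
    """Toy PET system matrix for a detector ring around a square pixel grid.

    Element A[j, k] approximates the length of line-of-response j inside
    pixel k, estimated by sampling points along the chord between the two
    detectors.  Reconstruction then uses A only through products A @ x and
    A.T @ r, so the matrix is never inverted.
    """
    R = 1.0                                          # ring radius
    angles = 2 * np.pi * np.arange(n_det) / n_det
    det = np.stack([R * np.cos(angles), R * np.sin(angles)], axis=1)

    # every detector pair defines one line of response (LOR)
    pairs = [(i, j) for i in range(n_det) for j in range(i + 1, n_det)]
    A = np.zeros((len(pairs), n_pix * n_pix))
    pix_size = 2 * R / n_pix

    for row, (i, j) in enumerate(pairs):
        p0, p1 = det[i], det[j]
        chord = np.linalg.norm(p1 - p0)
        ts = (np.arange(n_samples) + 0.5) / n_samples
        pts = p0[None, :] + ts[:, None] * (p1 - p0)[None, :]
        # map sample points to pixel indices inside the [-R, R]^2 grid
        ix = np.floor((pts[:, 0] + R) / pix_size).astype(int)
        iy = np.floor((pts[:, 1] + R) / pix_size).astype(int)
        ok = (ix >= 0) & (ix < n_pix) & (iy >= 0) & (iy < n_pix)
        np.add.at(A[row], iy[ok] * n_pix + ix[ok], chord / n_samples)
    return A
```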


IEEE Transactions on Nuclear Science | 1979

Theory of Imaging with a Very Limited Number of Projections

Jorge Llacer

A theory of imaging for detector systems with a very limited number of projections has been developed. The relationships between a matrix which determines the system, its eigenvectors and eigenvalues, and the physical characteristics of the detector system are analyzed in order to assist in the most effective design of an instrument. It is shown that reconstruction methods for complete data sets are essentially an extension of the methods developed for incomplete sets. The concept of mathematical sweeping to replace mechanical detector motion in incomplete detector systems is demonstrated.
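
To make the eigen-analysis concrete, the hedged sketch below examines the singular spectrum of a system matrix (for example, the toy ring matrix above) and builds a truncated pseudo-inverse that keeps only the well-determined image components. The tolerance and the use of SVD, rather than the paper's exact eigenvector formulation, are illustrative choices.

```python
import numpy as np

def recoverable_components(A, rel_tol=1e-3):
    """Inspect which image components a limited-projection system determines.

    A : (n_proj_bins, n_pixels) system matrix.
    The eigenvectors of A.T @ A with large eigenvalues span the image
    components that the measurements determine well; components with tiny
    eigenvalues are effectively invisible to the instrument.  The relative
    tolerance used here is arbitrary and only for illustration.
    """
    # singular values of A are the square roots of the eigenvalues of A.T @ A
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rel_tol * s[0]
    print(f"{keep.sum()} of {len(s)} image components are well determined")
    # truncated pseudo-inverse: a direct (matrix-based) reconstruction that
    # simply ignores the poorly determined components
    A_pinv = (Vt[keep].T / s[keep]) @ U[:, keep].T
    return A_pinv, Vt[keep]
```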


International Journal of Radiation Oncology Biology Physics | 1981

High energy beams of radioactive nuclei and their biomedical applications

Aloke Chatterjee; Edward L. Alpen; Cornelius A. Tobias; Jorge Llacer; J. Alonso

It is possible to produce energetic beams of radioactive nuclei, as secondary beams, from the heavy-particle compound accelerator called BEVALAC. These beams can be focused into experimental areas without significant contamination using suitable magnetic filters and proper beam optics. Properly selected high-energy beams of radioactive nuclei (those which decay by positron emission) can provide a truly unique opportunity to evaluate the effectiveness of these beams in localizing the Bragg peak on a tumor volume, which is necessary in heavy-particle therapy. Preliminary data are presented here to demonstrate the possible use of these beams in radiotherapy treatment-planning verification.


Publications of the Astronomical Society of the Pacific | 1993

A general Bayesian image reconstruction algorithm with entropy prior: preliminary application to HST data

Jorge Núñez; Jorge Llacer

This paper describes a general Bayesian iterative algorithm with entropy prior for image reconstruction. It solves the cases of both pure Poisson data and Poisson data with Gaussian readout noise. The algorithm maintains positivity of the solution; it includes case-specific prior information (default map) and flatfield corrections; it removes background and can be accelerated to be faster than the Richardson-Lucy algorithm. In order to determine the hyperparameter that balances the entropy and likelihood terms in the Bayesian approach, we have used a likelihood cross-validation technique. Cross-validation is more robust than other methods because it is less demanding in terms of the knowledge of exact data characteristics and of the point spread function. We have used the algorithm to reconstruct successfully images obtained in different space- and ground-based imaging situations. It has been possible to recover most of the original intended capabilities of the Hubble Space Telescope Wide Field and Planetary Camera and Faint Object Camera from images obtained in their present state. Semi-real situations for the future Wide Field Planetary Camera 2 show that even after the repair of the spherical aberration problem, image reconstruction can play a key role in improving the resolution of the cameras, well beyond the design of the Hubble instruments. We also show that ground-based images can be reconstructed successfully with the algorithm. A technique which consists of dividing the CCD observations into two frames, each with one half the exposure time, emerges as a recommended procedure for the utilization of the described algorithms. We have compared our technique with two commonly used reconstruction algorithms: the Richardson-Lucy and the Cambridge Maximum Entropy algorithms.
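
A rough sketch of the kind of objective involved is given below: a Poisson log-likelihood plus an entropy term measured against a default map, weighted by a hyperparameter alpha, together with a likelihood cross-validation loop for choosing alpha. The functional forms, the `fit` routine passed in, and the train/test split are assumptions made for illustration; they do not reproduce the paper's exact algorithm or its Gaussian readout-noise case.

```python
import numpy as np

def log_posterior(x, A, y, default_map, alpha):
    """Bayesian objective with entropy prior (illustrative form).

    Poisson log-likelihood of projection/pixel data y given image x under the
    system (or blurring) matrix A, plus alpha times an entropy term measured
    against a default map m.  Only the trade-off controlled by alpha is shown.
    """
    q = np.maximum(A @ x, 1e-12)
    log_like = np.sum(y * np.log(q) - q)            # Poisson terms, log(y!) dropped
    m = np.maximum(default_map, 1e-12)
    xs = np.maximum(x, 1e-12)
    entropy = np.sum(xs - m - xs * np.log(xs / m))  # <= 0, maximal at x = m
    return log_like + alpha * entropy

def pick_alpha_by_cross_validation(alphas, fit, A, y_train, y_test):
    """Choose alpha by likelihood cross-validation (sketch).

    `fit` is any reconstruction routine mapping (A, y, alpha) -> image; the
    split of the data into y_train / y_test mirrors the two-frame,
    half-exposure CCD procedure recommended in the paper.
    """
    def test_log_like(x):
        q = np.maximum(A @ x, 1e-12)
        return np.sum(y_test * np.log(q) - q)
    scores = [test_log_like(fit(A, y_train, a)) for a in alphas]
    return alphas[int(np.argmax(scores))]
```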


Neural Networks | 2003

Astronomical image segmentation by self-organizing neural networks and wavelets

Jorge Núñez; Jorge Llacer

Standard image segmentation methods may not be able to segment astronomical images because of their special nature. We present an algorithm for astronomical image segmentation based on self-organizing neural networks and wavelets. We begin by performing a wavelet decomposition of the image. The segmentation process then has two steps. In the first, we separate the stars and other prominent objects using the second plane (w2) of the wavelet decomposition, which has little noise but retains enough signal to represent those objects. This method was at least as effective as traditional source extraction methods in isolating bright objects both from the background and from extended sources. In the second step, the rest of the image (extended sources and background) is segmented using a self-organizing neural network. The result is a predetermined number of clusters, which we associate with extended regions plus a small region for each star or bright object. We have applied the algorithm to segment images of both galaxies and planets. The results show that the simultaneous use of all the scales in the self-organizing neural network helps the segmentation process, since it takes into account not only the intensity level but also both the high and low frequencies present in the image. The connectivity of the regions obtained also shows that the algorithm is robust in the presence of noise. The method can also be applied to restored images.
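
The sketch below follows the two-step structure described above under simplifying assumptions: a crude multiscale decomposition stands in for the paper's wavelet transform, the second plane is thresholded to mask stars, and a small hand-written self-organizing map clusters the remaining pixels using all scales as features. Kernel choices, thresholds, and SOM parameters are illustrative only.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def multiscale_planes(img, n_scales=4):
    """Crude a-trous-style decomposition: each plane is the difference between
    successive smoothings.  A box filter of growing size replaces the paper's
    wavelet kernel purely to keep the sketch short."""
    planes, smooth = [], img.astype(float)
    for j in range(n_scales):
        smoother = uniform_filter(smooth, size=2 ** (j + 2))
        planes.append(smooth - smoother)
        smooth = smoother
    return planes, smooth                    # planes w_1..w_n and the residual

def segment(img, n_scales=4, n_clusters=4, k_sigma=3.0, epochs=5, lr=0.5):
    planes, residual = multiscale_planes(img, n_scales)

    # step 1: stars / bright objects from the second plane (w_2), which has
    # little noise but keeps compact sources
    w2 = planes[1]
    star_mask = w2 > k_sigma * np.std(w2)

    # step 2: a tiny 1-D self-organizing map over multiscale feature vectors
    # (all planes plus the smooth residual) for the remaining pixels
    feats = np.stack(planes + [residual], axis=-1).reshape(-1, n_scales + 1)
    rest = ~star_mask.reshape(-1)
    rng = np.random.default_rng(0)
    nodes = feats[rest][rng.choice(rest.sum(), n_clusters, replace=False)].copy()
    for _ in range(epochs):
        for v in feats[rest][rng.permutation(rest.sum())][:5000]:
            w = np.argmin(np.linalg.norm(nodes - v, axis=1))     # winner node
            for j in range(n_clusters):                          # neighborhood
                h = np.exp(-((j - w) ** 2) / 2.0)
                nodes[j] += lr * h * (v - nodes[j])
    labels = np.argmin(np.linalg.norm(feats[:, None, :] - nodes, axis=2), axis=1)
    labels = labels.reshape(img.shape) + 1
    labels[star_mask] = 0                    # label 0 reserved for bright objects
    return labels
```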


IEEE Transactions on Medical Imaging | 1984

Imaging by Injection of Accelerated Radioactive Particle Beams

Jorge Llacer; A. Chatterjee; E. L. Alpen; W. Saunders; S. Andreae; H. C. Jackson

The process of imaging by detection of the annihilation gamma rays generated from positron emitters which have been injected into a patient by a particle accelerator has been studied in detail. The relationships between patient dose and injected activity have been calculated for C-11, N-13, O-15, F-17, and Ne-19 and measured for C-11 and Ne-19, with good agreement with the calculations. The requirements for imaging of the small amounts of activity that can be injected safely have been analyzed in terms of one specific application of the radioactive beam injection technique, that of Bragg peak localization in support of radiotherapy by heavy ions. The characteristics of an existing camera with sufficient sensitivity and spatial accuracy for that task are described. Results of the calculations of radioactive beam flux requirements are shown.


IEEE Transactions on Nuclear Science | 1986

Towards a Practical Implementation of the MLE Algorithm for Positron Emission Tomography

Jorge Llacer; Sypko W. Andreae; Eugene Veklerov; Edward J. Hoffman

Recognizing that the quality of images obtained by application of the Maximum Likelihood Estimator (MLE) to Positron Emission Tomography (PET) and Single Photon Emission Tomography (SPECT) appears to be substantially better than that of images obtained by conventional methods, we have started to develop methods that will facilitate the necessary research for a good evaluation of the algorithm and may lead to its practical application for research and routine tomography. We have found that the non-linear MLE algorithm can be used with pixel sizes which are smaller than the sampling distance, without interpolation, obtaining excellent resolution and no noticeable increase in noise. We have studied the role of symmetry in reducing the matrix element storage requirements for full-size applications of the algorithm and have used that concept to carry out two reconstructions of the Derenzo phantom with data from the ECAT-III instrument. The results show excellent signal-to-noise (S/N) ratio, particularly for data with low total counts, and excellent sharpness, but low contrast at high frequencies when using the Shepp-Vardi model for probability matrices.
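
The fragment below illustrates, in a hedged way, how symmetry can cut matrix storage: if the scanner geometry maps onto itself under a 90-degree rotation and the pixel grid is square, matrix rows need to be stored for only one angular wedge, and the remaining projections are obtained by rotating the image and reusing the same rows. The symmetry factor and indexing scheme are assumptions for illustration; the paper's exact bookkeeping is not reproduced.

```python
import numpy as np

def forward_project_with_symmetry(A_wedge, x_img, rotations=4):
    """Forward projection that stores only one angular wedge of the matrix.

    A_wedge holds the matrix rows for projection angles in the first wedge
    (e.g. 0..90 degrees for 4-fold symmetry).  Rows for the other wedges are
    never stored: rotating the image by 90 degrees and reusing A_wedge yields
    their projections, because the assumed ring geometry maps onto itself
    under that rotation.  A square pixel grid is assumed.
    """
    projections = []
    img = x_img
    for _ in range(rotations):
        projections.append(A_wedge @ img.reshape(-1))
        img = np.rot90(img)              # next wedge = same matrix, rotated image
    return np.concatenate(projections)   # storage cut by the symmetry factor
```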


IEEE Transactions on Nuclear Science | 1979

An Imaging Instrument for Positron Emitting Heavy Ion Beam Injection

Jorge Llacer; Aloke Chatterjee; Horace C. Jackson; Jaff C. Lin; Maria V. Zunzunegui

The design and performance of an instrument for the imaging of coincidence annihilation gamma rays emitted from the end point of the trajectories of radioactive high-energy heavy ions is described. The positron-emitting heavy ions are the result of nuclear fragmentation of accelerated heavy ions used in cancer therapy or diagnostic medicine. The instrument constructed is capable of locating the ion beam trajectory end point within 1 mm for an injected activity of 200 nanoCi in a measurement time of 1 s under favorable conditions. Limited imaging in three dimensions is also demonstrated.


The Marketplace for Industrial Lasers | 1987

On The Convergence Of The Maximum Likelihood Estimator Method Of Tomographic Image Reconstruction

Jorge Llacer; Eugene Veklerov; Edward J. Hoffman

The Maximum Likelihood Estimator (MLE) method of image reconstruction has been reported to exhibit image deterioration in regions of expected uniform activity as the number of iterations increases beyond a certain point. This apparent instability raises questions as to the usefulness of a method that yields images at different stages of the reconstruction that could have different medical interpretations. In this paper we look in some detail into the question of convergence of MLE solutions at a large number of iterations and show that the MLE method converges towards the image that it was designed to yield, i.e. the image which has the maximum likelihood of having generated the specific projection data resulting from a measurement. We also show that the maximum likelihood image can be a very deteriorated version of the true source image, and that only as the number of counts in the projection data becomes very high will the maximum likelihood image converge towards an acceptable reconstruction.
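
As an illustration of the convergence behaviour discussed above, the sketch below runs plain MLEM and records the Poisson log-likelihood at every iteration, together with the error against a known truth image when one is available (e.g. in a phantom simulation). The matrix, data, and iteration count are placeholders; the paper's experiments are not reproduced.

```python
import numpy as np

def poisson_log_likelihood(x, A, y):
    """Log-likelihood of projection data y given image x (log(y!) constant dropped)."""
    q = np.maximum(A @ x, 1e-12)
    return np.sum(y * np.log(q) - q)

def track_convergence(A, y, truth=None, n_iter=200):
    """Run plain MLEM and record the quantities discussed in the paper: the
    likelihood rises monotonically toward its maximum, while the error against
    a known truth image (if supplied) typically falls and then rises again
    when the projection data contain few counts."""
    x = np.ones(A.shape[1])
    sens = np.maximum(A.sum(axis=0), 1e-12)
    log_like, rmse = [], []
    for _ in range(n_iter):
        q = np.maximum(A @ x, 1e-12)
        x *= (A.T @ (y / q)) / sens
        log_like.append(poisson_log_likelihood(x, A, y))
        if truth is not None:
            rmse.append(np.sqrt(np.mean((x - truth.reshape(-1)) ** 2)))
    return x, log_like, rmse
```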

Collaboration


Dive into Jorge Llacer's collaborations.

Top Co-Authors

R.W. Sorensen

University of California


William Chu

University of California
