Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jesse L. Barlow is active.

Publication


Featured research published by Jesse L. Barlow.


SIAM Journal on Numerical Analysis | 1990

Computing accurate eigensystems of scaled diagonally dominant matrices

Jesse L. Barlow; James Demmel

When computing eigenvalues of symmetric matrices and singular values of general matrices in finite precision arithmetic, we generally expect to compute them only with an error bound proportional to the product of machine precision and the norm of the matrix. In particular, we do not expect to compute tiny eigenvalues and singular values to high relative accuracy. There are some important classes of matrices where we can do much better, including bidiagonal matrices, scaled diagonally dominant matrices, and scaled diagonally dominant definite pencils. These classes include many graded matrices, and all symmetric positive definite matrices that can be consistently ordered (and thus all symmetric positive definite tridiagonal matrices). In particular, the singular values and eigenvalues are determined to high relative precision independent of their magnitudes, and there are algorithms to compute them this accurately. The eigenvectors are also determined more accurately than for general matrices, and may be computed more accurately as well. This work extends results of Kahan and Demmel for bidiagonal and tridiagonal matrices.
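
To make the accuracy claim concrete, here is a schematic contrast in our own notation (not taken from the abstract): for a general symmetric matrix the computed eigenvalues carry only an absolute error bound, while for the scaled diagonally dominant classes above the bound is relative,

  |\hat{\lambda}_i - \lambda_i| \le p(n)\,\varepsilon\,\|A\|        (general symmetric A)
  |\hat{\lambda}_i - \lambda_i| \le p(n)\,\varepsilon\,|\lambda_i|  (scaled diagonally dominant A)

where \varepsilon is the machine precision and p(n) is a modest function of the dimension; under the second bound, even tiny eigenvalues retain most of their correct digits.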


SIAM Journal on Scientific Computing | 2005

Efficient Minimization Methods of Mixed l2-l1 and l1-l1 Norms for Image Restoration

Haoying Fu; Michael K. Ng; Mila Nikolova; Jesse L. Barlow

Image restoration problems are often solved by finding the minimizer of a suitable objective function. Usually this function consists of a data-fitting term and a regularization term. For the least squares solution, both the data-fitting and the regularization terms are in the ℓ2 norm. In this paper, we consider the least absolute deviation (LAD) solution and the least mixed norm (LMN) solution. For the LAD solution, both the data-fitting and the regularization terms are in the ℓ1 norm. For the LMN solution, the regularization term is in the ℓ1 norm but the data-fitting term is in the ℓ2 norm. Since images often have nonnegative intensity values, the proposed algorithms provide the option of taking into account the nonnegativity constraint. The LMN and LAD solutions are formulated as the solution to a linear or quadratic programming problem which is solved by interior point methods. At each iteration of the interior point method, a structured linear system must be solved. The preconditioned conjugate gradient method with factorized sparse inverse preconditioners is employed to solve such structured inner systems. Experimental results are used to demonstrate the effectiveness of our approach. We also show the quality of the restored images, using the minimization of mixed ℓ2-ℓ1 and ℓ1-ℓ1 norms.
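
As a minimal sketch (not the authors' implementation) of the formulation the abstract describes, the nonnegative LAD problem with an ℓ1 regularization term can be posed as a linear program. The identity regularizer, the penalty beta, and the generic solver below are illustrative assumptions; the paper allows more general regularization operators and solves the resulting LP/QP with tailored interior point methods using preconditioned conjugate gradients.

import numpy as np
from scipy.optimize import linprog

def lad_l1_restore(A, b, beta):
    """Solve  min_x ||A x - b||_1 + beta * ||x||_1  subject to x >= 0
    by rewriting it as a linear program with auxiliary variables r."""
    m, n = A.shape
    # Decision vector: [x (n), r (m)]; since x >= 0, ||x||_1 = sum(x).
    c = np.concatenate([beta * np.ones(n), np.ones(m)])
    # Encode |A x - b| <= r as two one-sided inequalities.
    G = np.block([
        [ A, -np.eye(m)],   #  A x - r <=  b
        [-A, -np.eye(m)],   # -A x - r <= -b
    ])
    h = np.concatenate([b, -b])
    bounds = [(0, None)] * (n + m)   # x >= 0 (nonnegative intensities), r >= 0
    res = linprog(c, A_ub=G, b_ub=h, bounds=bounds, method="highs")
    return res.x[:n]

# Toy usage with an illustrative random "blur" matrix and Laplacian noise.
rng = np.random.default_rng(0)
A = rng.random((30, 20))
x_true = np.abs(rng.random(20))
b = A @ x_true + 0.01 * rng.laplace(size=30)
x_rec = lad_l1_restore(A, b, beta=0.1)

Minimizing the sum of the slack variables r drives each r_i to |a_i^T x - b_i|, so the LP optimum coincides with the LAD objective; the paper's contribution is solving such structured programs efficiently rather than this generic reduction.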


International Conference on Machine Learning | 2006

Semi-supervised nonlinear dimensionality reduction

Xin Yang; Haoying Fu; Hongyuan Zha; Jesse L. Barlow



Computing | 1985

On roundoff error distributions in floating point and logarithmic arithmetic

Jesse L. Barlow; Erwin H. Bareiss



SIAM Journal on Scientific and Statistical Computing | 1988

Iterative Methods for Equality-Constrained Least Squares Problems

Jesse L. Barlow; Nancy Nichols; Robert J. Plemmons



SIAM Journal on Numerical Analysis | 1988

Error Analysis and Implementation Aspects of Deferred Correction for Equality Constrained Least Squares Problems

Jesse L. Barlow



SIAM Journal on Scientific and Statistical Computing | 1987

Scaled Givens rotations for the solution of linear least squares problems on systolic arrays

Jesse L. Barlow; Ilse C. F. Ipsen



BIT Numerical Mathematics | 1996

An algorithm and a stability theory for downdating the ULV decomposition

Jesse L. Barlow; Peter A. Yoon; Hongyuan Zha



Linear Algebra and its Applications | 2000

Optimal perturbation bounds for the Hermitian eigenvalue problem

Jesse L. Barlow; Ivan Slapničar



SIAM Journal on Scientific and Statistical Computing | 1988

The Direct Solution of Weighted and Equality Constrained Least-Squares Problems

Jesse L. Barlow; Susan L. Handy


Collaboration


Dive into Jesse L. Barlow's collaborations.

Top Co-Authors

Hasan Erbay
Kırıkkale University

Haoying Fu
Pennsylvania State University

Peter A. Yoon
Pennsylvania State University

Hongyuan Zha
Georgia Institute of Technology

Geunseop Lee
Pennsylvania State University

Udaya B. Vemulapati
Pennsylvania State University

Michael K. Ng
Hong Kong Baptist University

Alicja Smoktunowicz
Warsaw University of Technology