Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where David Titley-Peloquin is active.

Publication


Featured research published by David Titley-Peloquin.


SIAM Journal on Matrix Analysis and Applications | 2009

Stopping Criteria for the Iterative Solution of Linear Least Squares Problems

Xiao-Wen Chang; Christopher C. Paige; David Titley-Peloquin

We explain an interesting property of minimum residual iterative methods for the solution of the linear least squares (LS) problem. Our analysis demonstrates that the stopping criteria commonly used with these methods can in some situations be too conservative, causing any chosen method to perform too many iterations or even fail to detect that an acceptable iterate has been obtained. We propose a less conservative criterion to determine whether a given iterate is an acceptable LS solution. This is merely a sufficient condition, but it approaches a necessary condition in the limit as the given iterate approaches the exact LS solution. We also propose a necessary and sufficient condition to determine whether a given approximate LS solution is an acceptable LS solution, based on recent results on backward perturbation analysis of the LS problem. Although both of the above new conditions use quantities that are too expensive to compute in practical situations, we suggest potential approaches for estimating some of these quantities efficiently. We illustrate our results with several numerical examples.
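As a hedged illustration only (this is a textbook-style test, not the criterion proposed in the paper): for min ||b - Ax||_2 the exact solution satisfies A^T (b - Ax) = 0, so a common sufficient-type acceptance test checks whether ||A^T r|| is small relative to ||A|| ||r||. A minimal NumPy sketch:

```python
import numpy as np

def ls_stopping_test(A, b, x, tol=1e-8):
    """Simple sufficient-type acceptance test for min ||b - A x||_2.

    At the exact LS solution the residual r = b - A x satisfies A^T r = 0,
    so we test whether ||A^T r|| is small relative to ||A|| * ||r||.
    (Illustrative only; not the criterion developed in the paper.)
    """
    r = b - A @ x
    rnorm = np.linalg.norm(r)
    if rnorm == 0.0:                      # consistent system solved exactly
        return True
    normA = np.linalg.norm(A, 2)          # spectral norm; estimated in practice
    return np.linalg.norm(A.T @ r) / (normA * rnorm) <= tol

# usage: the exact LS solution passes the test, a poor iterate does not
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])
x_exact = np.linalg.lstsq(A, b, rcond=None)[0]
```

Note that computing ||A||_2 exactly is itself expensive for large sparse problems, which is one reason the paper discusses cheap estimates of such quantities.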


SIAM Journal on Matrix Analysis and Applications | 2013

Sensitivity and Conditioning of the Truncated Total Least Squares Solution

Serge Gratton; David Titley-Peloquin; Jean Tshimanga Ilunga

We present an explicit expression for the condition number of the truncated total least squares (TLS) solution of Ax ≈ b. This expression is obtained using the notion of the Fréchet derivative. We also give upper bounds on the condition number, which are simple to compute and interpret. These results generalize those in the literature for the untruncated TLS problem. Numerical experiments demonstrate that our bounds are often a very good estimate of the condition number, and provide a significant improvement over known bounds.


SIAM Journal on Matrix Analysis and Applications | 2010

Estimating the Backward Error in LSQR

Pavel Jiránek; David Titley-Peloquin

We propose practical stopping criteria for the iterative solution of sparse linear least squares (LS) problems. Although we focus our discussion on the algorithm LSQR of Paige and Saunders, the ideas discussed here may also be applicable to other algorithms. We review why the 2-norm of the projection of the residual vector onto the range of A is a useful measure of convergence, and we show how this projection can be estimated efficiently at every iteration of LSQR. We also give practical and cheaply computable estimates of the backward error for the LS problem.


Numerical Linear Algebra With Applications | 2009

Backward perturbation analysis for scaled total least-squares problems

Xiao-Wen Chang; David Titley-Peloquin

The scaled total least-squares (STLS) method unifies the ordinary least-squares (OLS), total least-squares (TLS), and data least-squares (DLS) methods. In this paper we perform a backward perturbation analysis of the STLS problem, which also unifies the backward perturbation analyses of the OLS, TLS, and DLS problems. We derive an expression for an extended minimal backward error of the STLS problem. This is an asymptotically tight lower bound on the true minimal backward error. If the given approximate solution is close enough to the true STLS solution (as is the goal in practice), then the extended minimal backward error is in fact the minimal backward error. Since the extended minimal backward error is expensive to compute directly, we present a lower bound on it as well as an asymptotic estimate for it, both of which can be computed or estimated more efficiently. Our numerical examples suggest that the lower bound gives good order-of-magnitude approximations, while the asymptotic estimate is an excellent estimate. We show how to use our results to easily obtain the corresponding results for the OLS and DLS problems in the literature.


Numerische Mathematik | 2017

A conjugate gradient like method for p-norm minimization in functional spaces

Claudio Estatico; Serge Gratton; Flavia Lenti; David Titley-Peloquin

We develop an iterative algorithm to recover the minimum p-norm solution of the functional linear equation …


SIAM Journal on Matrix Analysis and Applications | 2014

Differentiating the Method of Conjugate Gradients

Serge Gratton; David Titley-Peloquin; Philippe L. Toint; Jean Tshimanga Ilunga


SIAM Journal on Matrix Analysis and Applications | 2012

On the Accuracy of the Karlson–Waldén Estimate of the Backward Error for Linear Least Squares Problems

Serge Gratton; Pavel Jiránek; David Titley-Peloquin


SIAM Journal on Matrix Analysis and Applications | 2008

Characterizing Matrices That Are Consistent with Given Solutions

Xiao-Wen Chang; Christopher C. Paige; David Titley-Peloquin


Operations Research | 2008

Visualizing and Constructing Cycles in the Simplex Method

David Avis; Bohdan Kaluzny; David Titley-Peloquin
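Several of the abstracts above concern total least squares. As a brief, hedged illustration of the basic (untruncated) TLS solution via the SVD of the augmented matrix [A b], a textbook construction and not one of the papers' contributions:

```python
import numpy as np

def tls_solve(A, b):
    """Basic total least squares solution via the SVD of [A b].

    The TLS solution is read off the right singular vector associated with
    the smallest singular value of the augmented matrix (textbook
    construction; illustrative only).
    """
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                       # right singular vector for sigma_min
    if abs(v[n]) < 1e-14:
        raise ValueError("TLS solution does not exist (last component is zero)")
    return -v[:n] / v[n]

# usage: for a consistent system Ax = b, TLS recovers x exactly
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true
x_tls = tls_solve(A, b)
```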


SIAM Journal on Matrix Analysis and Applications | 2018

Improved Bounds for Small-Sample Estimation

Serge Gratton; David Titley-Peloquin
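The 2017 Numerische Mathematik abstract above concerns minimum p-norm solutions. As a hedged finite-dimensional illustration of the general idea (the classical iteratively reweighted least squares iteration, not the authors' conjugate gradient like method in functional spaces):

```python
import numpy as np

def irls_pnorm(A, b, p=1.5, iters=50, eps=1e-8):
    """Approximately minimize ||A x - b||_p by iteratively reweighted LS.

    Each step solves a weighted 2-norm problem with weights w_i = |r_i|^(p-2),
    applied via row scaling by sqrt(w_i). The eps floor guards against zero
    residuals. (Standard IRLS sketch; illustrative only.)
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]      # start at the 2-norm solution
    for _ in range(iters):
        r = A @ x - b
        sw = np.maximum(np.abs(r), eps) ** ((p - 2) / 2)   # sqrt of weights
        x = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
    return x

def pnorm(r, p):
    return np.sum(np.abs(r) ** p) ** (1.0 / p)

# usage: the p = 1.5 solution has a p-norm residual no larger than
# that of the plain least squares solution
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 5))
b = rng.standard_normal(30)
x15 = irls_pnorm(A, b, p=1.5)
x2 = np.linalg.lstsq(A, b, rcond=None)[0]
```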

Collaboration


Dive into David Titley-Peloquin's collaborations.

Top Co-Authors

Pavel Jiránek

Technical University of Liberec


Chen Greif

University of British Columbia
