
Publication


Featured research published by M. R. Osborne.


Journal of Computational and Graphical Statistics | 2000

On the LASSO and its Dual

M. R. Osborne; Brett Presnell; Berwin A. Turlach

Proposed by Tibshirani, the least absolute shrinkage and selection operator (LASSO) estimates a vector of regression coefficients by minimizing the residual sum of squares subject to a constraint on the ℓ1-norm of the coefficient vector. The LASSO estimator typically has one or more zero elements and thus shares characteristics of both shrinkage estimation and variable selection. In this article we treat the LASSO as a convex programming problem and derive its dual. Consideration of the primal and dual problems together leads to important new insights into the characteristics of the LASSO estimator and to an improved method for estimating its covariance matrix. Using these results we also develop an efficient algorithm for computing LASSO estimates which is usable even in cases where the number of regressors exceeds the number of observations. An S-Plus library based on this algorithm is available from StatLib.
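As a rough illustration of the shrinkage and selection behaviour described above, the sketch below fits the penalized (Lagrangian) form of the LASSO by cyclic coordinate descent with soft thresholding. It is not the dual-based algorithm of the paper; the function name lasso_cd, the penalty lam, and the simulated data are illustrative assumptions.

```python
# Minimal LASSO sketch via cyclic coordinate descent on the penalized form
#   min_beta  0.5 * ||y - X beta||^2 + lam * ||beta||_1
# Illustrative only; NOT the dual-based algorithm of Osborne, Presnell and Turlach.
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the scalar solution of the one-dimensional LASSO."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)                        # precomputed column norms
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]       # partial residual excluding j
            beta[j] = soft_threshold(X[:, j] @ r_j, lam) / col_sq[j]
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10))
    true_beta = np.zeros(10); true_beta[:3] = [3.0, -2.0, 1.5]
    y = X @ true_beta + 0.1 * rng.standard_normal(50)
    print(lasso_cd(X, y, lam=5.0))                       # most coefficients shrink to exactly 0
```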


Technometrics | 1985

Finite Algorithms in Optimization and Data Analysis

M. R. Osborne

Contents: Preface; Table of Notation; Some Results from Convex Analysis; Linear Programming; Applications of Linear Programming in Discrete Approximation; Polyhedral Convex Functions; Least Squares and Related Methods; Some Applications to Non-Convex Problems; Some Questions of Complexity and Performance; Appendices; References; Index.


SIAM Journal on Scientific Computing | 1995

A modified Prony algorithm for exponential function fitting

M. R. Osborne; Gordon K. Smyth

A modification of the classical technique of Prony for fitting sums of exponential functions to data is considered. The method maximizes the likelihood for the problem (unlike the usual implementation of Prony’s method, which is not even consistent for transient signals), proves to be remarkably effective in practice, and is supported by an asymptotic stability result. Novel features include a discussion of the problem parametrization and its implications for consistency. The asymptotic convergence proofs are made possible by an expression for the algorithm in terms of circulant divided difference operators.
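For context, the sketch below implements the classical Prony technique that the paper modifies: fit a linear recurrence to equispaced samples, take roots of the characteristic polynomial to obtain the exponential rates, then solve a linear least squares problem for the amplitudes. It is a noise-free toy, not the maximum-likelihood algorithm of the paper; the helper name prony and the test signal are assumptions.

```python
# Classical Prony fit of y_k ≈ sum_j c_j * exp(alpha_j * k * dt) on equispaced data.
# This basic technique behaves poorly in noise, which is what the modified
# (maximum-likelihood) algorithm of the paper addresses.
import numpy as np

def prony(y, p, dt=1.0):
    """Classical Prony fit of p exponentials to equispaced samples y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # 1) Fit the linear recurrence y[k+p] + a_1*y[k+p-1] + ... + a_p*y[k] = 0.
    A = np.array([y[k:k + p][::-1] for k in range(n - p)])
    a, *_ = np.linalg.lstsq(A, -y[p:], rcond=None)
    # 2) Roots of z^p + a_1*z^(p-1) + ... + a_p give z_j = exp(alpha_j * dt).
    z = np.roots(np.concatenate(([1.0], a))).astype(complex)
    alpha = np.log(z) / dt
    # 3) Linear least squares for the amplitudes c_j in y_k = sum_j c_j * z_j**k.
    V = z[None, :] ** np.arange(n)[:, None]
    c, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return alpha, c

if __name__ == "__main__":
    dt = 0.1
    t = np.arange(40) * dt
    y = 2.0 * np.exp(-0.5 * t) + 1.0 * np.exp(-2.0 * t)   # noise-free test signal
    rates, amps = prony(y, p=2, dt=dt)
    print(rates.real, amps.real)   # rates ≈ {-0.5, -2.0} with matching amplitudes {2.0, 1.0}
```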


The Journal of the Australian Mathematical Society. Series B. Applied Mathematics | 1976

Nonlinear least squares — the Levenberg algorithm revisited

M. R. Osborne

One of the most successful algorithms for nonlinear least squares calculations is that associated with the names of Levenberg, Marquardt, and Morrison. This algorithm gives a method which depends nonlinearly on a parameter γ for computing the correction to the current point. In this paper an attempt is made to give a rule for choosing γ which (a) permits a satisfactory convergence theorem to be proved, and (b) is capable of satisfactory computer implementation. It is believed that the stated aims have been met with reasonable success. The convergence theorem is both simple and global in character, and a computer code is given which appears to be at least competitive with existing alternatives.
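A minimal sketch of the Levenberg-type step discussed here: solve (JᵀJ + γI)δ = -Jᵀr and adapt γ. The acceptance rule below is the common decrease-on-success, increase-on-failure heuristic, not the specific rule for choosing γ derived in the paper; the toy fitting problem is an assumption.

```python
# Minimal Levenberg-type damped Gauss-Newton iteration:
#   solve (J^T J + gamma * I) delta = -J^T r, then adapt gamma.
# The gamma-update rule analysed in the paper is not reproduced here.
import numpy as np

def levenberg(residual, jacobian, x0, gamma=1e-2, n_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        r = residual(x)
        J = jacobian(x)
        delta = np.linalg.solve(J.T @ J + gamma * np.eye(len(x)), -J.T @ r)
        if np.sum(residual(x + delta) ** 2) < np.sum(r ** 2):
            x, gamma = x + delta, gamma * 0.5      # accept step, relax damping
        else:
            gamma *= 10.0                          # reject step, increase damping
    return x

if __name__ == "__main__":
    # Toy problem: fit y = exp(-a t) at a few points (true a = 1.3).
    t = np.linspace(0.0, 2.0, 20)
    y = np.exp(-1.3 * t)
    residual = lambda x: np.exp(-x[0] * t) - y
    jacobian = lambda x: (-t * np.exp(-x[0] * t)).reshape(-1, 1)
    print(levenberg(residual, jacobian, x0=[0.1]))   # converges near [1.3]
```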


SIAM Journal on Numerical Analysis | 1975

Some Special Nonlinear Least Squares Problems

M. R. Osborne

A least squares problem is called separable if the fitting function can be written as a linear combination of functions involving further parameters in a nonlinear manner. Here it is shown that a separable problem can be transformed to a minimization problem involving the nonlinear parameters only. This result can be interpreted as a generalization of the classical technique of Prony, and it also shows how the numerical difficulties associated with Prony’s method can be overcome. The transformation is then worked out in detail for two examples (exponential fitting to equispaced data and rational fitting). In both cases the condition for a stationary value of the transformed problem leads to a nonlinear eigenvalue problem. Two algorithms are then suggested and illustrated by numerical examples.
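The separable structure described above can be made concrete with a generic variable-projection style sketch: for each trial value of the nonlinear parameters the linear coefficients are eliminated by ordinary least squares, and only the reduced objective is minimized. This is not the nonlinear eigenvalue formulation developed in the paper; the exponential test problem and the use of scipy's Nelder-Mead minimizer are assumptions.

```python
# Separable least squares sketch: eliminate the linear coefficients for each value
# of the nonlinear parameters, then minimize the reduced objective only.
import numpy as np
from scipy.optimize import minimize

def design(alpha, t):
    """Columns exp(alpha_j * t): the linear part of the separable model."""
    return np.exp(np.outer(t, alpha))

def reduced_objective(alpha, t, y):
    Phi = design(alpha, t)
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear coefficients eliminated
    r = y - Phi @ c
    return r @ r

if __name__ == "__main__":
    t = np.linspace(0.0, 4.0, 60)
    y = 2.0 * np.exp(-0.4 * t) + 1.0 * np.exp(-3.0 * t)   # noise-free test data
    res = minimize(reduced_objective, x0=[-0.1, -1.0], args=(t, y), method="Nelder-Mead")
    alpha = res.x
    c, *_ = np.linalg.lstsq(design(alpha, t), y, rcond=None)
    print(alpha, c)   # ideally recovers rates ≈ {-0.4, -3.0} and amplitudes ≈ {2, 1}
```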


SIAM Journal on Numerical Analysis | 1983

Analysis of Newton’s Method at Irregular Singularities

Andreas Griewank; M. R. Osborne

Recent work on Newton's method at singularities of the Jacobian has established linear convergence under certain regularity assumptions. Here, Newton's method is analyzed in the neighborhood of irregular singularities, which include all minimizers at which the Hessian has a one-dimensional null space. Depending on a parameter specific to the underlying optimization problem or system of simultaneous equations, Newton's method is found either to converge with a limiting ratio of about 2/3, or to diverge from arbitrarily close starting points, or to behave in a certain sense chaotically.
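To make the quoted 2/3 ratio concrete, consider the one-variable caricature below: minimizing f(x) = x^4 with Newton's method, the Hessian 12x^2 vanishes at the minimizer and each step multiplies the error by exactly 2/3. This toy only illustrates such a limiting ratio; it is not the irregular-singularity analysis of the paper.

```python
# Newton's method for minimizing f(x) = x**4: the step is x - f'(x)/f''(x) = (2/3)*x,
# so the error contracts linearly with ratio 2/3 even though the Hessian 12*x**2
# is singular at the minimizer x = 0.
x = 1.0
for k in range(8):
    x_new = x - (4.0 * x**3) / (12.0 * x**2)   # Newton step
    print(k, x_new, x_new / x)                 # ratio column stays at 2/3
    x = x_new
```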


SIAM Journal on Scientific and Statistical Computing | 1991

A modified Prony algorithm for fitting functions defined by difference equations

M. R. Osborne; Gordon K. Smyth

This paper reformulates, generalizes, and investigates the stability of the modified Prony algorithm introduced by Osborne [SIAM J. Numer. Anal. 12 (1975), pp. 571–592], with special reference to rational and exponential fitting. The algorithm, originally for exponential functions, is generalized to the least squares fitting of any function which satisfies a linear homogeneous difference equation. Using the difference equation formulation, the problem is expressed as a separable regression, and hence as a nonlinear eigenproblem in terms of the coefficients of the difference equation. The eigenproblem involves finding the null space of a matrix of data differences B, and is solved using a variant of inverse iteration. Stability of the algorithm is shown to depend on the fact that B closely approximates the Hessian of the sum of squares. The expectations of B and the Hessian are evaluated. In the case of rational fitting, the relative difference between B and the Hessian is shown to converge to zero almost surely. Some details of the implementation of the algorithm are given. A simulation study compares the modified Prony algorithm with the Levenberg algorithm on a rational fitting problem, and supports the theoretical results.
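The abstract above notes that the nonlinear eigenproblem is solved by a variant of inverse iteration. The sketch below shows plain textbook inverse iteration for recovering an approximate null vector of a matrix B; it is a generic illustration only, and the shift value and test matrix are assumptions, not the paper's variant.

```python
# Generic inverse iteration for an (approximate) null vector of a square matrix B:
# repeatedly solve (B - mu * I) v_new = v and normalize, with shift mu near zero.
import numpy as np

def inverse_iteration(B, mu=1e-8, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(B.shape[0])
    v /= np.linalg.norm(v)
    shifted = B - mu * np.eye(B.shape[0])
    for _ in range(n_iter):
        v = np.linalg.solve(shifted, v)   # amplifies the eigenvector nearest the shift
        v /= np.linalg.norm(v)
    return v

if __name__ == "__main__":
    # Test matrix constructed to have the known null vector (1, 1, 1)/sqrt(3).
    null = np.ones(3) / np.sqrt(3)
    B = np.eye(3) - np.outer(null, null)   # symmetric, rank 2, B @ null = 0
    v = inverse_iteration(B)
    print(v, np.linalg.norm(B @ v))        # v ≈ ±null, residual ≈ 0
```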


Journal of Computational and Graphical Statistics | 1992

On the Consistency of Prony's Method and Related Algorithms

Margaret Kahn; M. S. Mackisack; M. R. Osborne; Gordon K. Smyth

Modifications of Prony's classical technique for estimating rate constants in exponential fitting problems have many contemporary applications. In this article the consistency of Prony's method and of related algorithms based on maximum likelihood is discussed as the number of observations n → ∞ by considering the simplest possible models for fitting sums of exponentials to observed data. Two sampling regimes are relevant, corresponding to transient problems and problems of frequency estimation, each of which is associated with rather different kinds of behavior. The general pattern is that the stronger results are obtained for the frequency estimation problem. However, the algorithms considered are all scaling dependent and consistency is not automatic. A new feature that emerges is the importance of an appropriate choice of scale in order to ensure consistency of the estimates in certain cases. The tentative conclusion is that algorithms referred to as Objective Function Reweighting Algorithms ...


International Statistical Review | 1992

Fisher's Method of Scoring

M. R. Osborne


SIAM Journal on Numerical Analysis | 1988

A Riccati transformation method for solving linear BVPs. I: theoretical aspects

Luca Dieci; M. R. Osborne; Robert D. Russell

Collaboration


Dive into M. R. Osborne's collaborations.

Top Co-Authors

Markus Hegland (Australian National University)
David L. Harrar (Australian National University)
G. A. Watson (Australian National University)
Gordon K. Smyth (Walter and Eliza Hall Institute of Medical Research)
Richard P. Brent (Australian National University)
Berwin A. Turlach (University of Western Australia)
Krisorn Jittorntrum (Australian National University)
Margaret Kahn (Australian National University)