
Publication


Featured research published by Jin Yun Yuan.


BIT Numerical Mathematics | 2001

SOR-like Methods for Augmented Systems

Gene H. Golub; Xiaonan Wu; Jin Yun Yuan

Several SOR-like methods are proposed for solving augmented systems. Such systems arise in many applications in scientific computing, for example constrained optimization and the finite element method for the Stokes equations. The convergence of the algorithms and the choice of optimal parameters are studied, convergence and divergence regions are given for some of the algorithms, and the new methods are applied to solve the Stokes equations.
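
For orientation, here is a minimal numerical sketch of one alternating, SOR-style sweep on a small augmented (saddle point) system. The matrices, the parameter omega, and the block preconditioner Q below are illustrative assumptions, not the specific SOR-like splittings or optimal parameters analyzed in the paper.

```python
import numpy as np

# Illustrative 2-by-2 block augmented (saddle point) system
#   [ A   B ] [x]   [b]
#   [ B^T 0 ] [y] = [q]
# A, B, b, q, and omega are made-up example data.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])      # symmetric positive definite block
B = np.array([[1.0],
              [2.0]])           # coupling block
b = np.array([1.0, 2.0])
q = np.array([3.0])

omega = 1.0                                 # relaxation parameter (illustrative choice)
Q = B.T @ np.linalg.solve(A, B)             # preconditioner for the (2,2) block

x = np.zeros(2)
y = np.zeros(1)
for _ in range(50):
    # Alternating, SOR-style update of the two blocks of unknowns
    x = (1 - omega) * x + omega * np.linalg.solve(A, b - B @ y)
    y = y + omega * np.linalg.solve(Q, B.T @ x - q)

# Residuals of the two block equations should now be (near) zero
print(np.linalg.norm(A @ x + B @ y - b), np.linalg.norm(B.T @ x - q))
```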


Computational Optimization and Applications | 2002

Modified Two-Point Stepsize Gradient Methods for Unconstrained Optimization

Yu-Hong Dai; Jin Yun Yuan; Ya-Xiang Yuan

For unconstrained optimization, the two-point stepsize gradient method is preferable to the classical steepest descent method both in theory and in practical computation. In this paper we interpret the choice of stepsize in the two-point stepsize gradient method from the viewpoint of interpolation and propose two modified two-point stepsize gradient methods. The modified methods are globally convergent under mild assumptions on the objective function. Numerical results are reported which suggest that improvements have been achieved.
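
The stepsize in question is the two-point (Barzilai-Borwein) stepsize. Below is a minimal sketch of the basic, unsafeguarded method on a convex quadratic; the test problem, tolerances, and iteration count are chosen purely for illustration and the paper's modified, globally convergent variants are not reproduced.

```python
import numpy as np

def bb_gradient(f_grad, x0, iters=100, alpha0=1e-3):
    """Basic two-point stepsize (Barzilai-Borwein) gradient method sketch.
    The safeguards/globalization of the modified methods are omitted."""
    x = x0.copy()
    g = f_grad(x)
    alpha = alpha0
    for _ in range(iters):
        x_new = x - alpha * g
        g_new = f_grad(x_new)
        s, y = x_new - x, g_new - g
        denom = s @ y
        # First BB stepsize alpha = (s^T s) / (s^T y); fall back if the denominator is tiny
        alpha = (s @ s) / denom if abs(denom) > 1e-12 else alpha0
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic f(x) = 0.5 x^T H x - c^T x
H = np.diag([1.0, 10.0, 100.0])
c = np.ones(3)
x_star = bb_gradient(lambda x: H @ x - c, np.zeros(3))
print(x_star)  # should approach H^{-1} c = [1, 0.1, 0.01]
```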


Journal of Computational and Applied Mathematics | 1998

Block SOR methods for rank-deficient least-squares problems

C. H. Santos; B. P. B. Silva; Jin Yun Yuan

Many papers have discussed preconditioned block iterative methods for solving full-rank least-squares problems, but very few have studied iterative methods for rank-deficient least-squares problems. Miller and Neumann (1987) proposed a 4-block SOR method for the rank-deficient problem. Here a 2-block SOR method and a 3-block SOR method are proposed to solve such problems. The convergence of the block SOR methods is studied and the optimal parameters are determined. A comparison between the 2-block and 3-block SOR methods is also given.
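
For context, block SOR methods for least squares are usually applied to an augmented block formulation of the problem; the standard 2-block form is shown below. The specific 2-block and 3-block partitionings and the treatment of rank deficiency in the paper may differ.

```latex
% Standard augmented (2-block) form of the least-squares problem min_x ||Ax - b||_2:
% introduce the residual r = b - Ax and solve the coupled system
\[
\begin{pmatrix} I & A \\ A^{T} & 0 \end{pmatrix}
\begin{pmatrix} r \\ x \end{pmatrix}
=
\begin{pmatrix} b \\ 0 \end{pmatrix} ,
\]
% block SOR then relaxes the r-block and the x-block alternately.
```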


Numerical Algorithms | 2004

Conjugate Gradient Method for Rank Deficient Saddle Point Problems

X. Wu; B.P.B. Silva; Jin Yun Yuan

We propose an alternative iterative method to solve rank-deficient problems arising in many real applications, such as the finite element approximation of the Stokes equations and computational genetics. Our main contribution is to transform the rank-deficient problem into a smaller full-rank problem whose structure is kept as sparse as possible. The new system also has a much improved condition number. Numerical experiments suggest that the new iterative method works very well for large sparse rank-deficient saddle point problems.
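
The following is only a toy illustration of the general idea (shrink the rank-deficient problem to a full-rank one, then run the conjugate gradient method); the saddle point structure and the actual structure-preserving transformation of the paper are not reproduced, and the data are made up.

```python
import numpy as np
from scipy.sparse.linalg import cg

# Hypothetical rank-deficient least-squares problem min ||A x - b||_2:
# the third column of A is identically zero, so A^T A is singular.
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [1.0, 1.0, 0.0]])
b = np.array([1.0, 2.0, 3.0, 2.0])

# Step 1: reduce to a smaller full-rank problem (here: drop the dependent column).
keep = [0, 1]
A_red = A[:, keep]

# Step 2: apply the conjugate gradient method to the now SPD reduced normal equations.
x_red, info = cg(A_red.T @ A_red, A_red.T @ b)
assert info == 0  # info == 0 signals convergence

# Expand back to the original variables (the dropped variable is set to zero).
x = np.zeros(A.shape[1])
x[keep] = x_red
print(x, np.linalg.norm(A @ x - b))
```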


Journal of Computational and Applied Mathematics | 1996

Preconditioned conjugate gradient method for generalized least squares problems

Jin Yun Yuan; Alfredo N. Iusem

A variant of the preconditioned conjugate gradient method to solve generalized least-squares problems is presented. If the problem is min (Ax − b)^T W^{-1} (Ax − b) with A ∈ R^{m×n} and W ∈ R^{m×m} symmetric and positive definite, the method needs only a preconditioner A_1 ∈ R^{n×n}, but not the inverse of W or of any of its submatrices. Freund's comparison result for regular least-squares problems is extended to generalized least-squares problems. An error bound is also given.
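
For reference, the generalized least-squares problem and its weighted normal equations are shown below; the point of the paper's variant is that the iteration avoids forming W^{-1} explicitly, which this identity by itself does not convey.

```latex
% Generalized least-squares problem and its weighted normal equations,
% with A in R^{m x n} and W in R^{m x m} symmetric positive definite:
\[
\min_{x \in \mathbb{R}^{n}} \; (Ax - b)^{T} W^{-1} (Ax - b)
\quad \Longleftrightarrow \quad
A^{T} W^{-1} A \, x \;=\; A^{T} W^{-1} b .
\]
```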


Applied Mathematics and Computation | 1999

Convergence of the generalized AOR method

Jin Yun Yuan; Xiao-Qing Jin

A generalized AOR (GAOR) method is proposed to solve linear systems with a special structure. The convergence of the GAOR method is studied; the convergence result obtained here extends the classical convergence result for the regular AOR method under a weaker condition. In comparison with the generalized SOR (GSOR) method, we show that the GAOR method is better under certain conditions; in some cases the GAOR method converges even when the GSOR method does not. Finally, the GAOR method is applied to solve generalized least-squares (LS) problems, and three numerical algorithms for the generalized LS problems are given.
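
As background, the classical AOR iteration that the GAOR method generalizes is given below for the splitting A = D − L − U; the generalized version for the block-structured systems treated in the paper is not reproduced here.

```latex
% Classical AOR iteration for A = D - L - U (D diagonal, L strictly lower,
% U strictly upper), with acceleration parameter r and relaxation parameter omega:
\[
x^{(k+1)} = (D - rL)^{-1}\bigl[(1-\omega)D + (\omega - r)L + \omega U\bigr]\, x^{(k)}
          + \omega\, (D - rL)^{-1} b .
\]
% r = omega recovers SOR; r = 0, omega = 1 recovers the Jacobi method.
```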


SIAM Journal on Numerical Analysis | 2006

On a Linearized Backward Euler Method for the Equations of Motion of Oldroyd Fluids of Order One

Amiya K. Pani; Jin Yun Yuan; Pedro Damazio

In this paper, a linearized backward Euler method is discussed for the equations of motion arising in the Oldroyd model of viscoelastic fluids. Some new a priori bounds are obtained for the solution under realistically assumed conditions on the data. Further, the exponential decay properties for the exact as well as the discrete solutions are established. Finally, a priori error estimates are derived in the H^1 and L^2 norms.
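
To indicate what "linearized backward Euler" means here (without writing out the full Oldroyd model and its memory term), a generic step for an evolution equation of Navier-Stokes type, u_t + (u·∇)u + Au = f, lags the convecting velocity so that only a linear problem is solved at each time level. This is a sketch of the linearization pattern, not the paper's exact scheme.

```latex
% One linearized backward Euler step of size k: the convecting velocity in the
% nonlinear term is lagged to time level n, so only a linear problem is solved.
\[
\frac{u^{n+1} - u^{n}}{k} + (u^{n} \cdot \nabla)\, u^{n+1} + A\, u^{n+1} = f(t_{n+1}) .
\]
```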


Mathematical Programming | 2015

On the convergence and worst-case complexity of trust-region and regularization methods for unconstrained optimization

Geovani Nunes Grapiglia; Jin Yun Yuan; Ya-Xiang Yuan



Applied Mathematics and Computation | 1997

Trust region algorithm for nonsmooth optimization

R.J.B. de Sampaio; Jin Yun Yuan; Wenyu Sun



Journal of Mathematical Chemistry | 2014

A computational modeling of two dimensional reaction-diffusion Brusselator system arising in chemical processes

Ram Jiwari; Jin Yun Yuan

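For reference, the two-dimensional reaction-diffusion Brusselator system named in the title is usually written as below, with concentrations u and v, constant parameters A and B, and diffusion coefficient alpha; the paper's particular computational scheme is not described here.

```latex
% Two-dimensional reaction-diffusion Brusselator system:
\[
\frac{\partial u}{\partial t} = B + u^{2} v - (A + 1)\, u
  + \alpha \left( \frac{\partial^{2} u}{\partial x^{2}} + \frac{\partial^{2} u}{\partial y^{2}} \right),
\qquad
\frac{\partial v}{\partial t} = A\, u - u^{2} v
  + \alpha \left( \frac{\partial^{2} v}{\partial x^{2}} + \frac{\partial^{2} v}{\partial y^{2}} \right).
\]
```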

Collaboration


Dive into Jin Yun Yuan's collaborations.

Top Co-Authors

Ya-Xiang Yuan, Chinese Academy of Sciences
Wenyu Sun, Nanjing Normal University
Amiya K. Pani, Indian Institute of Technology Bombay
Raimundo J. B. de Sampaio, Pontifícia Universidade Católica do Paraná
Pedro Damazio, Federal University of Paraná
Tong Zhang, Federal University of Paraná