
Publication


Featured research published by Yasushi Narushima.


SIAM Journal on Optimization | 2011

A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization

Yasushi Narushima; Hiroshi Yabe; John A. Ford

Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems because they do not require matrix storage. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed method. Moreover, we present a specific three-term conjugate gradient method based on the multistep quasi-Newton method. Finally, some numerical results for the proposed method are given.
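Below is a minimal sketch of the general three-term direction described in the abstract: d_k = -g_k + beta_k d_{k-1} - beta_k (g_k^T d_{k-1} / g_k^T p_k) p_k, which yields g_k^T d_k = -||g_k||^2 for any beta_k. The concrete choices here (Hestenes-Stiefel beta, p_k = g_k, a backtracking Armijo line search) are illustrative assumptions, not the paper's specific method.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=5000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking line search on the Armijo condition
        t = 1.0
        while t > 1e-16 and f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y
        beta = (g_new @ y) / dy if abs(dy) > 1e-12 else 0.0  # Hestenes-Stiefel
        gp = g_new @ g_new  # g_k^T p_k with the assumed choice p_k = g_k
        if gp > 1e-12:
            # the third term cancels beta * g_new^T d, so
            # g_new^T d_new = -||g_new||^2 (sufficient descent) holds exactly
            d = -g_new + beta * d - beta * ((g_new @ d) / gp) * g_new
        else:
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function from a standard start point
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
print(three_term_cg(f, grad, [-1.2, 1.0]))
```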


Journal of Computational and Applied Mathematics | 2012

Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization

Yasushi Narushima; Hiroshi Yabe

Conjugate gradient methods have attracted attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a descent search direction. On the other hand, Hager and Zhang (2005) proposed another conjugate gradient method which always generates a descent search direction. In this paper, combining the ideas of Dai and Liao and of Hager and Zhang, we propose conjugate gradient methods based on secant conditions that generate descent search directions. In addition, we prove global convergence properties of the proposed methods. Finally, preliminary numerical results are given.
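As a reference point for the two ingredients combined in this paper, here are the Dai-Liao and Hager-Zhang beta formulas as usually stated in the literature; the parameter value t = 0.1 and the made-up iterate data are only illustrative, and the paper's combined formulas may differ.

```python
import numpy as np

def beta_dai_liao(g_new, d, s, y, t=0.1):
    """Dai-Liao (2001): beta = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k),
    which bakes the secant condition into the CG update (t >= 0)."""
    return (g_new @ (y - t * s)) / (d @ y)

def beta_hager_zhang(g_new, d, y):
    """Hager-Zhang (2005): the extra -2*(||y||^2/(d^T y))*d term is what
    keeps the resulting search direction a descent direction."""
    dy = d @ y
    return ((y - (2.0 * (y @ y) / dy) * d) @ g_new) / dy

# tiny usage with made-up iterate data (d^T y > 0 as under Wolfe steps)
g_new = np.array([0.3, -0.1]); d = np.array([-1.0, 0.4])
s = 0.5 * d; y = np.array([-0.2, 0.1])
print(beta_dai_liao(g_new, d, s, y), beta_hager_zhang(g_new, d, y))
```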


Computational Optimization and Applications | 2006

Global Convergence of a Memory Gradient Method for Unconstrained Optimization

Yasushi Narushima; Hiroshi Yabe

Memory gradient methods are used for unconstrained optimization, especially for large-scale problems. Memory gradient methods were first proposed by Miele and Cantrell (1969) and Cragg and Levy (1969). In this paper, we present a new memory gradient method which generates a descent search direction for the objective function at every iteration. We show that our method converges globally to the solution if the Wolfe conditions are satisfied within the line search framework. Our numerical results show that the proposed method is efficient on standard test problems for suitable choices of the parameter included in the method.
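The Wolfe conditions invoked in the convergence result are standard; here is a small checker, with the usual textbook constants 0 < c1 < c2 < 1 chosen illustratively.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Weak Wolfe conditions for a step size t along direction d:
    (i)  f(x + t*d) <= f(x) + c1*t*grad(x)^T d   (sufficient decrease)
    (ii) grad(x + t*d)^T d >= c2*grad(x)^T d     (curvature)."""
    gd = grad(x) @ d
    return (f(x + t * d) <= f(x) + c1 * t * gd
            and grad(x + t * d) @ d >= c2 * gd)

# usage on f(x) = ||x||^2 with a steepest-descent step
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, -2.0])
print(satisfies_wolfe(f, grad, x, -grad(x), 0.25))  # True
```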


Computational Optimization and Applications | 2015

A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization

Mehiddin Al-Baali; Yasushi Narushima; Hiroshi Yabe

Conjugate gradient methods, which usually generate descent search directions, are useful for large-scale optimization. Narushima et al. (SIAM J Optim 21:212–230, 2011) proposed a three-term conjugate gradient method which satisfies a sufficient descent condition. We extend this method to a two-parameter family of three-term conjugate gradient methods in which the parameters can be used to control the magnitude of the directional derivative. We show that these methods converge globally and work well for suitable choices of the parameters. Numerical results are also presented.
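For reference, the sufficient descent condition the family is designed to satisfy, in standard conjugate gradient notation (g_k the gradient and d_k the search direction at iteration k); the size of the constant c is what the two parameters let one control:

```latex
g_k^\top d_k \le -c\,\|g_k\|^2 \quad \text{for all } k, \qquad c > 0.
```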


Journal of Computational and Applied Mathematics | 2010

Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems

Michiya Kobayashi; Yasushi Narushima; Hiroshi Yabe

In this paper, we deal with conjugate gradient methods for solving nonlinear least squares problems. Several Newton-like methods have been studied for solving nonlinear least squares problems, which include the Gauss-Newton method, the Levenberg-Marquardt method and the structured quasi-Newton methods. On the other hand, conjugate gradient methods are appealing for general large-scale nonlinear optimization problems. By combining the structured secant condition and the idea of Dai and Liao (2001) [20], the present paper proposes conjugate gradient methods that make use of the structure of the Hessian of the objective function of nonlinear least squares problems. The proposed methods are shown to be globally convergent under some assumptions. Finally, some numerical results are given.
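For orientation, the least-squares structure these methods exploit: for f(x) = (1/2)||r(x)||^2 the Hessian splits as J(x)^T J(x) + sum_i r_i(x) * Hess r_i(x). Below is a minimal Gauss-Newton step using only the first, readily available part; it illustrates the structure, not the paper's conjugate gradient method.

```python
import numpy as np

def gauss_newton_step(J, r):
    """Solve (J^T J) p = -J^T r via a least-squares solve of
    min_p ||J p + r||; J^T J is the cheap, structured part of the
    Hessian of 0.5*||r(x)||^2 that structured methods exploit."""
    p, *_ = np.linalg.lstsq(J, -r, rcond=None)
    return p

# usage: one step for residuals r(x) linearized at a point
J = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
r = np.array([0.5, -0.3, 0.1])
print(gauss_newton_step(J, r))
```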


Optimization Letters | 2012

Conjugate gradient methods using value of objective function for unconstrained optimization

Hideaki Iiduka; Yasushi Narushima

Conjugate gradient methods have been widely used as schemes to solve large-scale unconstrained optimization problems. The search directions of the conventional methods are defined using only the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods which also take into account the values of the objective function. We prove that they converge globally and compare them numerically with conventional methods. The results show that, with a slight modification to the direction, one of our methods performs as well as the best conventional method employing the Hestenes–Stiefel formula.
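One well-known way to feed objective-function values into such updates is through a modified secant vector in the style of Zhang, Deng, and Chen; it is shown here purely to illustrate the general idea and is not claimed to be the formula of this paper.

```python
import numpy as np

def modified_secant_y(f_old, f_new, g_old, g_new, s):
    """Zhang-Deng-Chen style modification: y_bar = y + (theta/||s||^2)*s
    with theta = 6*(f_old - f_new) + 3*(g_old + g_new)^T s, so that
    function values, not just gradients, enter the update."""
    y = g_new - g_old
    theta = 6.0 * (f_old - f_new) + 3.0 * (g_old + g_new) @ s
    return y + (theta / (s @ s)) * s
```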


Abstract and Applied Analysis | 2013

A Smoothing Method with Appropriate Parameter Control Based on Fischer-Burmeister Function for Second-Order Cone Complementarity Problems

Yasushi Narushima; Hideho Ogasawara; Shunsuke Hayashi

We deal with complementarity problems over second-order cones. Complementarity problems form an important class of real-world problems and include many optimization problems. The complementarity problem can be reformulated as a nonsmooth system of equations. Based on the smoothed Fischer-Burmeister function, we construct a smoothing Newton method for solving such a nonsmooth system. The proposed method controls a smoothing parameter appropriately. We show the global and quadratic convergence of the method. Finally, some numerical results are given.
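For illustration, the scalar smoothed Fischer-Burmeister function behind the construction; the paper applies the same idea to second-order cones through Jordan-algebra operations.

```python
import numpy as np

def smoothed_fb(a, b, mu):
    """phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2).
    For mu > 0 it is smooth; as mu -> 0 it tends to the nonsmooth
    FB function, whose zeros encode a >= 0, b >= 0, a*b = 0."""
    return a + b - np.sqrt(a * a + b * b + 2.0 * mu * mu)

# as mu shrinks, phi_mu approaches the exact FB value (0 here)
for mu in (1.0, 0.1, 0.001):
    print(mu, smoothed_fb(2.0, 0.0, mu))
```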


International Conference on Advanced Applied Informatics | 2017

Economic Experiments in YBG: The Case of Manufacturing Industry Game

Yasushi Narushima

In this paper, we examine whether the Yokohama Business Game (YBG) is an effective tool for economic experiments. For this purpose, we develop a game about manufacturing industries using YBG. By analyzing the results of the proposed game with evolutionary game theory, we compare the outcomes of the game with the theoretical predictions.


Optimization Methods & Software | 2017

Descent three-term conjugate gradient methods based on secant conditions for unconstrained optimization

Hiroshi Kobayashi; Yasushi Narushima; Hiroshi Yabe

The conjugate gradient method is an effective method for large-scale unconstrained optimization problems. Recent research has proposed conjugate gradient methods based on secant conditions to establish fast convergence of the methods. However, these methods do not always generate a descent search direction. In contrast, Y. Narushima, H. Yabe, and J.A. Ford [A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM J. Optim. 21 (2011), pp. 212–230] proposed a three-term conjugate gradient method which always satisfies the sufficient descent condition. This paper makes use of both ideas to propose descent three-term conjugate gradient methods based on particular secant conditions, and then shows their global convergence properties. Finally, numerical results are given.
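A sketch of how the two ingredients fit together: a secant-based (Dai-Liao type) beta plugged into the three-term structure so that g^T d = -||g||^2 holds by construction. The choices t = 0.1 and p = g_new are illustrative assumptions, not the paper's particular secant conditions.

```python
import numpy as np

def descent_three_term_direction(g_new, d, s, y, t=0.1, eps=1e-12):
    """Dai-Liao type beta inside the three-term direction; the third
    term cancels beta * g_new^T d, enforcing sufficient descent."""
    dy = d @ y
    beta = ((g_new @ (y - t * s)) / dy) if abs(dy) > eps else 0.0
    gp = g_new @ g_new              # g^T p with the assumed p = g_new
    if gp <= eps:
        return -g_new
    return -g_new + beta * d - beta * ((g_new @ d) / gp) * g_new
```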


Journal of Industrial and Management Optimization | 2017

Memoryless quasi-Newton methods based on spectral-scaling Broyden family for unconstrained optimization

Shummin Nakayama; Yasushi Narushima; Hiroshi Yabe

Memoryless quasi-Newton methods are studied for solving large-scale unconstrained optimization problems. Recently, memoryless quasi-Newton methods based on several kinds of updating formulas have been proposed. Since these methods are closely related to the conjugate gradient method, they are promising. In this paper, we propose a memoryless quasi-Newton method based on the Broyden family with the spectral-scaling secant condition. We focus on the convex and preconvex classes of the Broyden family, and we show that the proposed method satisfies the sufficient descent condition and converges globally. Finally, some numerical experiments are given.
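A sketch of the memoryless idea for the BFGS member of the Broyden family (the paper works with the broader convex and preconvex classes): apply one update to a spectrally scaled identity, so d = -H g needs only vectors, never a stored matrix. The scaling gamma = s^T y / y^T y is an assumed choice.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y, eps=1e-12):
    """d = -H g where H is one BFGS update of gamma*I:
    H = gamma*(I - s y^T/rho)(I - y s^T/rho) + s s^T/rho, rho = s^T y.
    Only inner products are needed, so storage stays O(n)."""
    rho = s @ y
    if abs(rho) <= eps:
        return -g                     # fall back to steepest descent
    gamma = rho / (y @ y)             # spectral scaling of the seed
    sg, yg = s @ g, y @ g
    Hg = gamma * (g - (sg / rho) * y - (yg / rho) * s
                  + (sg * (y @ y) / rho ** 2) * s) + (sg / rho) * s
    return -Hg
```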

Collaboration


Yasushi Narushima's top co-authors.

Top Co-Authors

Hiroshi Yabe (Tokyo University of Science)

Hideho Ogasawara (Tokyo University of Science)

Shummin Nakayama (Tokyo University of Science)

Atsushi Kato (Tokyo University of Science)