
Publication


Featured research published by Zhiyou Wu.


Mathematical Programming | 2007

Non-convex quadratic minimization problems with quadratic constraints: global optimality conditions

V. Jeyakumar; Alex M. Rubinov; Zhiyou Wu

In this paper, we first examine how global optimality of non-convex constrained optimization problems is related to Lagrange multiplier conditions. We then establish Lagrange multiplier conditions for global optimality of general quadratic minimization problems with quadratic constraints. We also obtain necessary global optimality conditions, which are different from the Lagrange multiplier conditions for special classes of quadratic optimization problems. These classes include weighted least squares with ellipsoidal constraints, and quadratic minimization with binary constraints. We discuss examples which demonstrate that our optimality conditions can effectively be used for identifying global minimizers of certain multi-extremal non-convex quadratic optimization problems.
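The binary-constrained case can be made concrete with a small, self-contained check. The sketch below is illustrative only: the tridiagonal matrix is our own choice, and the brute-force enumeration is precisely what the paper's analytic optimality conditions avoid, since they certify a global minimizer without searching the feasible set.

```python
import itertools

# Illustrative tridiagonal quadratic form; NOT taken from the paper.
A = [[2, 1, 0, 0],
     [1, 2, 1, 0],
     [0, 1, 2, 1],
     [0, 0, 1, 2]]

def q(x):
    """Evaluate the quadratic form x^T A x for a binary vector x."""
    n = len(x)
    return sum(A[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

# Brute-force global minimization over the binary set {-1, 1}^4.
# Feasible only for tiny n (2^n points); global optimality conditions
# of the kind studied in the paper sidestep this enumeration entirely.
best_x, best_val = min(
    ((x, q(x)) for x in itertools.product((-1, 1), repeat=4)),
    key=lambda t: t[1],
)
```

For this matrix the minimum is attained at the two alternating-sign vectors, where the coupling terms are all negative.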


Journal of Global Optimization | 2006

Sufficient Global Optimality Conditions for Non-convex Quadratic Minimization Problems With Box Constraints

V. Jeyakumar; Alex M. Rubinov; Zhiyou Wu

In this paper we establish conditions which ensure that a feasible point is a global minimizer of a quadratic minimization problem subject to box constraints or binary constraints. In particular, we show that our conditions provide a complete characterization of global optimality for non-convex weighted least squares minimization problems. We present a new approach which makes use of a global subdifferential. It is formed by a set of functions which are not necessarily linear functions, and it enjoys explicit descriptions for quadratic functions. We also provide numerical examples to illustrate our optimality conditions.
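The flavor of the box-constrained problem can be seen in the separable special case, where the global solution is available in closed form even when some coordinates are non-convex. This toy (our own, much narrower than the paper's general non-separable setting) decouples the coordinates: convex ones clamp the stationary point to the box, concave ones take the better endpoint.

```python
def box_qp_separable(a, b, lo, hi):
    """Global minimizer of sum_i (a[i]*x_i^2 + b[i]*x_i) over [lo, hi]^n.

    Separable special case only: coordinates decouple into 1-D
    quadratics. Convex coordinates (a[i] > 0) clamp the stationary
    point; non-convex ones (a[i] <= 0) are minimized at an endpoint.
    """
    x = []
    for ai, bi in zip(a, b):
        if ai > 0:
            x.append(min(max(-bi / (2 * ai), lo), hi))
        else:
            x.append(min((lo, hi), key=lambda t: ai * t * t + bi * t))
    return x

# One convex and one concave coordinate over the box [0, 1]^2:
#   coordinate 1: x^2 - 6x  -> stationary point at 3, clamped to 1
#   coordinate 2: -x^2 - x  -> concave, so the endpoint x = 1 wins
xstar = box_qp_separable(a=[1, -1], b=[-6, -1], lo=0.0, hi=1.0)
```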


Journal of Global Optimization | 2007

A filled function method for constrained global optimization

Zhiyou Wu; Fu-Sheng Bai; H. W. J. Lee; Y. J. Yang

In this paper, a filled function method for solving constrained global optimization problems is proposed. A filled function is constructed to escape the current local minimizer of the constrained problem by combining the filled-function idea from unconstrained global optimization with the penalty-function idea from constrained optimization. Based on this function, a method for obtaining a global minimizer, or an approximate global minimizer, of the constrained problem is presented. Numerical results demonstrate the efficiency of the method.
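The two-phase structure of a filled function method can be sketched in one dimension. This is a Ge-style filled function on an unconstrained toy double-well of our own; the paper's construction additionally incorporates a penalty term to handle constraints, which this sketch omits, and real implementations tune the parameters r and rho adaptively.

```python
import math

def f(x):
    # Toy double-well objective (ours, not from the paper):
    # local minimum near x ~ 0.96, global minimum near x ~ -1.04.
    return (x * x - 1.0) ** 2 + 0.3 * x

def fprime(x):
    return 4.0 * x ** 3 - 4.0 * x + 0.3

def local_min(x, lr=0.05, iters=2000):
    # Plain gradient descent; stands in for any local solver.
    for _ in range(iters):
        x -= lr * fprime(x)
    return x

def filled(x, x1, r=10.0, rho=4.0):
    # Ge-style filled function: the current local minimizer x1
    # becomes a local MAXIMIZER of F, so descending F moves the
    # iterate away from x1's basin of attraction.
    return math.exp(-(x - x1) ** 2 / rho) / (r + f(x))

def escape(x1, h=0.02, lo=-3.0, hi=3.0):
    # Walk downhill on the filled function from x1 in each direction;
    # stop as soon as a point strictly below f(x1) is found.
    f1 = f(x1)
    for direction in (-1.0, 1.0):
        x, Fx = x1, filled(x1, x1)
        while lo <= x + direction * h <= hi:
            xn = x + direction * h
            Fn = filled(xn, x1)
            if Fn >= Fx:
                break          # filled function stopped decreasing
            x, Fx = xn, Fn
            if f(x) < f1:
                return x       # entered a lower basin
    return None

x_local = local_min(1.5)        # local phase: lands in the higher basin
x_new = escape(x_local)         # filled-function phase: escapes it
x_global = local_min(x_new)     # local phase again, from the new point
```

The method alternates these two phases: minimize f locally, then minimize the filled function to leave the current basin, repeating until no lower basin is found.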


Optimization | 2004

An exact lower order penalty function and its smoothing in nonlinear programming

Zhiyou Wu; Fu-Sheng Bai; Xiaoqi Yang; Lian-Sheng Zhang

In this article, we consider a lower order penalty function and its ε-smoothing for an inequality constrained nonlinear programming problem. It is shown that any strict local minimum satisfying the second-order sufficiency condition for the original problem is a strict local minimum of the lower order penalty function with any positive penalty parameter. By using an ε-smoothing approximation to the lower order penalty function, we get a modified smooth global exact penalty function under mild assumptions.
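The idea can be sketched numerically with exponent k = 1/2 on a one-variable toy problem of our own. The particular smooth surrogate for max(t, 0) used here is an illustrative choice, not necessarily the ε-smoothing of the article; with a modest penalty parameter, the smoothed penalty's minimizer already sits next to the true constrained solution.

```python
import numpy as np

# Toy problem (ours, for illustration): min (x-2)^2  s.t.  x - 1 <= 0.
# The constrained minimizer is x* = 1.
def f(x):
    return (x - 2.0) ** 2

def g(x):
    return x - 1.0          # constraint g(x) <= 0

def penalty_smooth(x, sigma=2.0, eps=1e-4, k=0.5):
    # Lower-order penalty f + sigma * max(g, 0)^k, with max(t, 0)
    # replaced by the smooth surrogate (t + sqrt(t^2 + eps^2)) / 2.
    # The surrogate is strictly positive for eps > 0, so its k-th
    # power is smooth everywhere. (Illustrative smoothing, not
    # necessarily the one used in the article.)
    s = (g(x) + np.sqrt(g(x) ** 2 + eps ** 2)) / 2.0
    return f(x) + sigma * s ** k

# Minimize the smooth penalty on a grid: its minimizer sits just
# inside the feasible region, close to the true solution x* = 1.
xs = np.linspace(0.0, 3.0, 3001)
x_hat = xs[np.argmin(penalty_smooth(xs))]
```

Note the exactness at work: the unconstrained minimizer of f is x = 2, but the low-order penalty term grows steeply enough near the boundary to pin the minimizer at the constraint, without sending sigma to infinity.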


Journal of Global Optimization | 2006

Liberating the Subgradient Optimality Conditions from Constraint Qualifications

V. Jeyakumar; Zhiyou Wu; G. M. Lee; N. Dinh

In convex optimization the significance of constraint qualifications is evidenced by the simple duality theory and the elegant subgradient optimality conditions which completely characterize a minimizer. However, constraint qualifications do not always hold even for finite-dimensional optimization problems and frequently fail for infinite-dimensional problems. In the present work we take a broader view of the subgradient optimality conditions by allowing them to depend on a sequence of ε-subgradients at a minimizer and then letting them hold in the limit. Liberating the optimality conditions in this way permits us to obtain a complete characterization of optimality without a constraint qualification. As an easy consequence of these results we obtain optimality conditions for conic convex optimization problems without a constraint qualification. We derive these conditions by applying a powerful combination of conjugate analysis and ε-subdifferential calculus. Numerical examples are discussed to illustrate the significance of the sequential conditions.


Journal of Global Optimization | 2015

Gradient-free method for nonsmooth distributed optimization

Jueyou Li; Changzhi Wu; Zhiyou Wu; Qiang Long

In this paper, we consider a distributed nonsmooth optimization problem over a computational multi-agent network. We first extend the (centralized) Nesterov's random gradient-free algorithm and Gaussian smoothing technique to the distributed case. Then, the convergence of the algorithm is proved. Furthermore, an explicit convergence rate is given in terms of the network size and topology. Our proposed method is free of gradient, which may be preferred by practical engineers. Since only the cost function value is required, our method may in theory suffer a factor of up to d (the dimension of the agent) in convergence rate compared with distributed subgradient-based methods. However, our numerical simulations show that for some nonsmooth problems, our method can even achieve better performance than subgradient-based methods, which may be caused by the slow convergence of those methods.


Applied Mathematics and Computation | 2006

A new filled function method for nonlinear integer programming problem

Y. H. Gu; Zhiyou Wu


Neurocomputing | 2016

Distributed mirror descent method for multi-agent optimization with delay

Jueyou Li; Guo Chen; Zhao Yang Dong; Zhiyou Wu


Optimization | 2009

Global optimality conditions for mixed nonconvex quadratic programs

Zhiyou Wu; Fu-Sheng Bai


Journal of Global Optimization | 2007

Sufficient global optimality conditions for weakly convex minimization problems

Zhiyou Wu
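The random gradient-free oracle behind the distributed method described above can be sketched in its centralized form. The distributed, multi-agent version adds network averaging, which this sketch omits; the test function, step size, and iteration count are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Smooth convex test function (ours): f(x) = ||x||^2.
    return float(x @ x)

def gradient_free_oracle(f, x, mu=1e-3):
    # Nesterov-style two-point estimator with Gaussian smoothing:
    #   g = (f(x + mu*u) - f(x)) / mu * u,   u ~ N(0, I_d).
    # Its expectation is the gradient of the Gaussian-smoothed f;
    # only function VALUES of f are queried, never gradients.
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

# Gradient-free descent in d = 5 dimensions. The d-dependent variance
# of the oracle is what forces the small step size, and is the source
# of the factor-of-d slowdown mentioned in the abstract.
x = np.ones(5)
for _ in range(3000):
    x = x - 0.01 * gradient_free_oracle(f, x)
```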

Collaboration


Top co-authors of Zhiyou Wu:

Fu-Sheng Bai (Chongqing Normal University)
V. Jeyakumar (University of New South Wales)
Jueyou Li (Chongqing Normal University)
Guoquan Li (Chongqing Normal University)
Alex M. Rubinov (Federation University Australia)