Alexander M. Rubinov
Federation University Australia
Publications
Featured research published by Alexander M. Rubinov.
Siam Journal on Optimization | 1999
Alexander M. Rubinov; B. M. Glover; X. Q. Yang
The theory of increasing positively homogeneous functions defined on the positive orthant is applied to the class of decreasing functions. A multiplicative version of the inf-convolution operation is studied for decreasing functions. Modified penalty functions for some constrained optimization problems are introduced; in general they are nonlinear with respect to the objective function of the original problem. As the perturbation function of a constrained optimization problem is decreasing, the theory of decreasing functions is subsequently applied to the study of modified penalty functions, the zero duality gap property, and exact penalization.
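As one illustration of such penalty functions (the k-th power form below is a common instance of this construction and is given only as an assumption about its general shape, not as the paper's exact family): for the problem min f(x) subject to g_i(x) ≤ 0 (i = 1, …, m), with f > 0, the classical penalty

  f(x) + d·Σ_i max(0, g_i(x))

is affine in the objective, whereas a modified penalty of the form

  L_k(x, d) = ( f(x)^k + d·Σ_i max(0, g_i(x))^k )^{1/k},  k > 0, d ≥ 0,

is nonlinear with respect to f; penalties of this type are studied precisely because, for suitable k, the zero duality gap property and exact penalization can hold under weaker assumptions than in the linear case.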
Optimization | 2001
Alexander M. Rubinov; Ivan Singer
We study topical and sub-topical functions (i.e., functions which are increasing in the natural partial ordering of ℝ^n and additively homogeneous, respectively additively sub-homogeneous), and downward sets (i.e., subsets of ℝ^n which contain, along with each element, all smaller elements), in the framework of abstract convex analysis, with the aid of the additive min-type coupling function ϕ. We study the primal and dual links between topical functions and closed downward sets, via plus-Minkowski gauges and ϕ-support functions of sets, and via level sets and ϕ-support sets of functions. We also give characterizations of topical functions in terms of their Fenchel-Moreau conjugates and biconjugates with respect to the above coupling function ϕ.
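To make the objects concrete (these are the standard definitions in this area; the explicit form of ϕ below is the usual additive min-type coupling and is stated here as an assumption about the paper's notation): a function f on ℝ^n is topical if it is increasing and additively homogeneous, i.e. f(x + λ𝟙) = f(x) + λ for all x and all λ ∈ ℝ, where 𝟙 = (1, …, 1); it is sub-topical if instead f(x + λ𝟙) ≤ f(x) + λ for λ ≥ 0. A simple topical function is f(x) = max_i x_i. The additive min-type coupling function is typically

  ϕ(x, y) = min_{i=1,…,n} (x_i + y_i),

which plays the role that the bilinear form ⟨x, y⟩ plays in classical convex duality.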
Applied Mathematics Letters | 1999
Mikhail Andramonov; Alexander M. Rubinov; B. M. Glover
A generalization of the cutting plane method from convex minimization is proposed, applicable to a very broad class of nonconvex global optimization problems. Convergence results are described along with details of the initial numerical implementation of the algorithms. In particular, we study minimization problems in which the objective function is increasing and convex-along-rays.
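To indicate how the affine cuts of the classical method are replaced (written here for increasing positively homogeneous (IPH) functions on ℝ^n_+, a special case of the class treated; the formula is a sketch, not a quotation from the paper): where convex minimization uses the affine minorant f(x^k) + ⟨s^k, x − x^k⟩ built from a subgradient s^k, the nonconvex method uses a min-type minorant such as

  h^k(x) = f(x^k) · min_{i: x^k_i > 0} x_i / x^k_i,

which satisfies h^k(x) ≤ f(x) for all x ∈ ℝ^n_+ and h^k(x^k) = f(x^k); the algorithm then minimizes the pointwise maximum of the accumulated cuts at each iteration.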
Journal of Global Optimization | 2004
Alexander M. Rubinov; Rafail N. Gasimov
We consider problems of vector optimization with preferences that are not necessarily pre-order relations. We introduce a class of functions which can serve for the scalarization of these problems and consider a scalar duality based on recently developed methods of nonlinear penalization for scalar problems with a single constraint.
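A classical illustration of a scalarizing function consistent with the usual Pareto preference (given only as a familiar example; the class introduced in the paper is more general and is tailored to preferences that need not be pre-orders): the weighted Chebyshev function

  s_{w,r}(y) = max_{i=1,…,m} w_i (y_i − r_i),  w_i > 0,

is increasing with respect to the componentwise order on the outcome space, so minimizers of s_{w,r} over the image of the feasible set are weakly efficient points of the vector problem.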
Optimization and Engineering | 2002
Adil M. Bagirov; Alexander M. Rubinov; John Yearwood
We reduce the classification problem to a global optimization problem, and a method based on a combination of the cutting angle method and a local search is applied to its solution. The proposed method can handle classification problems for databases with an arbitrary number of classes. Numerical experiments have been carried out with databases of small to medium size. We present their results and compare them with those obtained by 29 different classification algorithms. The best overall performance was achieved with the global optimization method.
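One way such a reduction can be set up (a simplified sketch with illustrative notation, not necessarily the exact objective used in the paper): represent each class by q centres c^1, …, c^q ∈ ℝ^n and fit them to that class's data points a^1, …, a^N by minimizing the sum of minimum squared distances,

  minimize over c^1, …, c^q   Σ_{j=1}^{N} min_{s=1,…,q} ‖a^j − c^s‖²,

which is nonsmooth and nonconvex as soon as q ≥ 2; this is the kind of global optimization problem for which the cutting angle method combined with a local search is a natural solver, and a new point is then assigned to the class whose centres lie nearest.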
Journal of Global Optimization | 2002
Juan Enrique Martínez-Legaz; Alexander M. Rubinov; Ivan Singer
We develop a theory of downward subsets of the space ℝ^I, where I is a finite index set. Downward sets arise as the set of all solutions of a system of inequalities x ∈ ℝ^I, f_t(x) ≤ 0 (t ∈ T), where T is an arbitrary index set and each f_t (t ∈ T) is an increasing function defined on ℝ^I. These sets play an important role in some parts of mathematical economics and game theory. We examine some functions related to a downward set (the distance to this set and the plus-Minkowski gauge of this set, which we introduce here) and study lattices of closed downward sets and of corresponding distance functions. We discuss two kinds of duality for downward sets, based on multiplicative and additive min-type functions, respectively, and corresponding separation properties, and we give some characterizations of best approximations by downward sets. Some links between the multiplicative and additive cases are established.
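Two small points that may help fix ideas (the first is immediate from the definitions; the explicit gauge formula is stated only up to the sign convention actually used in the paper): if f is increasing on ℝ^I, then W = {x ∈ ℝ^I : f(x) ≤ 0} is downward, since y ≤ x ∈ W implies f(y) ≤ f(x) ≤ 0. The plus-Minkowski gauge of a downward set W can be thought of as the additive analogue of the usual Minkowski gauge, e.g.

  ν_W(x) = inf { λ ∈ ℝ : x − λ𝟙 ∈ W },  𝟙 = (1, …, 1),

so that the level set {x : ν_W(x) ≤ 0} recovers W when W is closed.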
Annals of Operations Research | 2000
Adil M. Bagirov; Alexander M. Rubinov
In this paper we study a method for global optimization of increasing positively homogeneous functions over the unit simplex, which is a version of the cutting angle method. Some properties of the auxiliary subproblem are studied and a special algorithm for its solution is proposed. A cutting angle method based on this algorithm allows one to find an approximate solution of some problems of global optimization with 50 variables. Results of numerical experiments are discussed.
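A minimal sketch of the iteration in code (Python, with illustrative names; the auxiliary subproblem, whose specialized solution is the main subject of the paper, is replaced here by brute-force minimization over a fixed random sample of the simplex):

```python
import numpy as np

def cutting_angle_min(f, n, iters=25, samples=2000, seed=0):
    """Approximately minimize an IPH function f over the unit simplex in R^n.

    Each evaluated point x^k with value f(x^k) > 0 contributes a min-type cut
        h_k(x) = f(x^k) * min_{i: x^k_i > 0} x_i / x^k_i,
    which underestimates f, and the next iterate minimizes the pointwise
    maximum of the accumulated cuts (here, crudely, over a sampled grid).
    """
    rng = np.random.default_rng(seed)
    candidates = rng.dirichlet(np.ones(n), size=samples)  # sample points of the simplex
    # Initial evaluation points: the barycentre and the vertices of the simplex.
    points = [np.full(n, 1.0 / n)] + list(np.eye(n))
    values = [f(x) for x in points]

    def lower_approx(cands):
        # Pointwise maximum over all cuts, evaluated at every candidate at once.
        best = np.full(len(cands), -np.inf)
        for xk, fk in zip(points, values):
            mask = xk > 0
            cuts = fk * np.min(cands[:, mask] / xk[mask], axis=1)
            best = np.maximum(best, cuts)
        return best

    for _ in range(iters):
        # Surrogate for the auxiliary subproblem: minimize the current
        # lower approximation over the sampled candidates.
        x_next = candidates[np.argmin(lower_approx(candidates))]
        points.append(x_next)
        values.append(f(x_next))

    k = int(np.argmin(values))
    return points[k], values[k]

if __name__ == "__main__":
    # Example IPH function: increasing on R^n_+ and positively homogeneous of degree 1.
    f = lambda x: 2.0 * np.max(x) + np.sum(x)
    x_best, f_best = cutting_angle_min(f, n=5)
    print(x_best, f_best)
```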
Optimization | 1999
Alexander M. Rubinov; B. M. Glover; Xiaoqi Yang
We consider problems of continuous constrained optimization in finite-dimensional space and study generalized nonlinear Lagrangian and penalty functions which are formed by increasing convolution functions.
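A schematic of the construction (σ and the specific choices below are illustrative assumptions about the shape of the convolution, not the paper's definitions): given an increasing function σ on ℝ^{1+m}, a generalized Lagrangian for the problem min f(x) subject to g_i(x) ≤ 0 (i = 1, …, m) is

  L(x, d) = σ( f(x), d_1 g_1(x), …, d_m g_m(x) ),  d_i ≥ 0;

the choice σ(u_0, u_1, …, u_m) = u_0 + Σ_i max(0, u_i) recovers the classical linear penalty function, while other increasing convolutions, for instance σ(u) = ( max(0, u_0)^k + Σ_i max(0, u_i)^k )^{1/k}, produce Lagrangians that are genuinely nonlinear in the objective.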
Optimization Methods & Software | 1999
Alexander M. Rubinov; Mikhail Andramonov
We propose a general scheme for reducing a Lipschitz programming problem to a problem of minimizing an increasing convex-along-rays function. It is based on the positively homogeneous extension of degree p of the objective function and a projective transformation onto the unit simplex. The application of the cutting angle method to Lipschitz programming is considered.
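A sketch of the extension step (the precise growth condition on p is stated here only loosely): let f be Lipschitz and positive on the unit simplex S = {x ∈ ℝ^n_+ : Σ_i x_i = 1}, and for x ∈ ℝ^n_+ \ {0} define

  f_p(x) = (Σ_i x_i)^p · f( x / Σ_i x_i ).

Along each ray x = t·u with u ∈ S one has f_p(t·u) = t^p f(u), which is convex in t ≥ 0 whenever p ≥ 1, so f_p is convex-along-rays; for p large enough (relative to the Lipschitz constant of f and its positive lower bound on S) f_p is also increasing, and minimizing f over S is equivalent to minimizing f_p over S.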
Journal of Global Optimization | 2004
Rafail N. Gasimov; Alexander M. Rubinov
We examine augmented Lagrangians for optimization problems with a single (either inequality or equality) constraint. We establish some links between augmented Lagrangians and Lagrange-type functions and propose a new kind of Lagrange-type functions for a problem with a single inequality constraint. Finally, we discuss a supergradient algorithm for calculating optimal values of dual problems corresponding to some class of augmented Lagrangians.
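A generic form of such a step (written for a dual function q built from an augmented Lagrangian with a single inequality constraint g(x) ≤ 0; the exact parametrization of the dual variables in the paper may differ): the dual function

  q(d) = inf_x L(x, d)

is concave in the dual parameters d, and if x^k (approximately) attains the infimum at d^k, then the gradient of L(x^k, ·) at d^k is a supergradient of q at d^k (in the classical linear case L(x, d) = f(x) + d·g(x) this is just the constraint value g(x^k)); the iteration

  d^{k+1} = d^k + s_k · ∇_d L(x^k, d^k),  s_k > 0,

performs supergradient ascent on q, projected back onto the admissible dual region when necessary.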