Publications

Featured research published by Vera Roshchina.


Optimization | 2008

Exhausters and subdifferentials in non-smooth analysis

Vladimir F. Demyanov; Vera Roshchina

Non-smooth analysis emerged in the 1960s and is still gaining momentum, developing new tools and covering new areas of application. One of the notions of non-smooth analysis is that of the exhauster, which represents a dual construction in non-smooth analysis. This article discusses the relationships between upper and lower exhausters and various subdifferentials of non-smooth functions. It is shown that exhausters are closely related to other non-smooth tools, such as the Michel–Penot, Clarke, Gâteaux and Fréchet subdifferentials. Formulae for computing these subdifferentials by means of exhausters are obtained. The relations discovered are all equalities, i.e. a calculus for computing the mentioned subdifferentials by means of exhausters is provided.
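As background (these are the standard definitions due to Demyanov, not a quotation from the article): for a positively homogeneous function h on R^n, an upper exhauster E* and a lower exhauster E_* are families of convex compact sets representing h dually, via

```latex
% Standard definitions of upper and lower exhausters of a
% positively homogeneous function h (background material):
h(g) = \inf_{C \in E^*} \max_{v \in C} \langle v, g \rangle
  \qquad \text{(upper exhauster } E^*\text{)},
\qquad
h(g) = \sup_{C \in E_*} \min_{v \in C} \langle v, g \rangle
  \qquad \text{(lower exhauster } E_*\text{)}.
```

Taking h to be the directional derivative of a non-smooth function at a point is what links exhausters to the subdifferentials named in the abstract.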


Journal of Global Optimization | 2008

Exhausters, optimality conditions and related problems

Vladimir F. Demyanov; Vera Roshchina

The notion of exhausters was introduced in Demyanov (Exhauster of a positively homogeneous function, Optimization 45, 13–29, 1999). These dual tools (upper and lower exhausters) can be employed to describe optimality conditions and to find directions of steepest ascent and descent for a very wide range of nonsmooth functions. Importantly, exhausters enjoy a very good calculus (in the form of equalities). In the present paper we review the constrained and unconstrained optimality conditions in terms of exhausters, introduce necessary and sufficient conditions for Lipschitz continuity and quasidifferentiability, and also present some new results on relationships between exhausters and other nonsmooth tools (such as the Clarke, Michel–Penot and Fréchet subdifferentials).
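To illustrate the kind of optimality condition involved (a standard background statement, not a result quoted from the paper): if h is the directional derivative of f at a candidate point and E* is an upper exhauster of h, the first-order necessary condition for a minimum reads

```latex
% First-order necessary condition for an unconstrained minimum,
% stated in exhauster terms (standard background):
h(g) \ge 0 \;\; \forall g
\;\Longleftrightarrow\;
\inf_{C \in E^*} \max_{v \in C} \langle v, g \rangle \ge 0 \;\; \forall g
\;\Longleftrightarrow\;
0 \in C \;\; \text{for every } C \in E^*.
```

The last equivalence uses the fact that, for a convex compact set C, the support function of C is nonnegative in every direction exactly when C contains the origin.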


SIAM Journal on Optimization | 2010

Applying Metric Regularity to Compute a Condition Measure of a Smoothing Algorithm for Matrix Games

Boris S. Mordukhovich; Javier Peña; Vera Roshchina

We develop an approach based on variational analysis and generalized differentiation to address conditioning issues for two-person zero-sum matrix games. Our major results establish precise relationships between a certain condition measure of the smoothing first-order algorithm proposed by Gilpin, Pena, and Sandholm [Proceedings of the 23rd Conference on Artificial Intelligence, 2008, pp. 75–82] and the exact bound of metric regularity for an associated set-valued mapping. In this way we compute the aforementioned condition measure in terms of the initial matrix game data.


Optimization Methods & Software | 2008

A new class of quasi-Newton updating formulas

Dong-Hui Li; Liqun Qi; Vera Roshchina

In this paper, we propose a derivative-free quasi-Newton condition, which results in a new class of quasi-Newton updating formulas for unconstrained optimization. Each updating formula in this class is a rank-two updating formula and preserves the positive definiteness of the second derivative matrix of the quadratic model. Its first two terms are the same as the first two terms of the BFGS updating formula. We establish global convergence of quasi-Newton methods based upon the updating formulas in this class, and superlinear convergence of a special quasi-Newton method among them. We then propose a special quasi-Newton updating formula, which repeatedly uses the new quasi-Newton condition. This updating formula is derivative-free. Numerical results are reported. This paper is dedicated to Professor M.J.D. Powell on the occasion of his 70th birthday.
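For context, the classical BFGS update whose first two terms the abstract refers to is (standard background, with the usual step and gradient-difference vectors):

```latex
% Standard BFGS update of the Hessian approximation B_k (background):
B_{k+1} = B_k
  - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
  + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
s_k = x_{k+1} - x_k, \quad
y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```

The last term depends on gradients through y_k; a derivative-free quasi-Newton condition, as described in the abstract, avoids that dependence while keeping the first two terms.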


Journal of Optimization Theory and Applications | 2014

Directed Subdifferentiable Functions and the Directed Subdifferential Without Delta-Convex Structure

Robert Baier; Elza Farkhi; Vera Roshchina

We show that the directed subdifferential introduced for differences of convex (delta-convex, DC) functions by Baier and Farkhi can be constructed from the directional derivative without using any information on the delta-convex structure of the function. The new definition extends to a more general class of functions, which includes Lipschitz functions definable in an o-minimal structure and quasidifferentiable functions.


Mathematics of Computation | 2013

Fast computation of zeros of polynomial systems with bounded degree under finite-precision

Irénée Briquel; Felipe Cucker; Javier Peña; Vera Roshchina

A solution to Smale's 17th problem, for the case of systems with bounded degree, was recently given. This solution, an algorithm computing approximate zeros of complex polynomial systems in average polynomial time, assumed infinite precision. In this paper we describe a finite-precision version of this algorithm. Our main result shows that this version works within the same time bounds and requires a precision which, on average, amounts to a polynomial number of bits in the mantissa of the intervening floating-point numbers.


Journal of Optimization Theory and Applications | 2015

On Local Coincidence of a Convex Set and its Tangent Cone

Kaiwen Meng; Vera Roshchina; Xiaoqi Yang

In this paper, we introduce the exact tangent approximation property for a convex set and provide its characterizations, including the nonzero extent of a convex set. We obtain necessary and sufficient conditions for the closedness of the positive hull of a convex set via a limit set defined by truncated upper level sets of the gauge function. We also apply the exact tangent approximation property to study the existence of a global error bound for a proper, lower semicontinuous and positively homogeneous function.


SIAM Journal on Optimization | 2014

Facially Exposed Cones Are Not Always Nice

Vera Roshchina

We address the conjecture proposed by Gábor Pataki that every facially exposed cone is nice. We show that the conjecture holds in the three-dimensional case; however, there exists a four-dimensional counterexample: a cone that is facially exposed but not nice.


Optimization | 2017

Outer limits of subdifferentials for min–max type functions

Andrew Eberhard; Vera Roshchina; Tian Sang

We generalize the outer subdifferential construction suggested by Cánovas, Henrion, López and Parra for max-type functions to pointwise minima of regular Lipschitz functions. We also answer an open question, posed by Li, Meng and Yang, about the relation between the outer subdifferential of the support of a regular function and the end set of its subdifferential.


Optimization Letters | 2014

Some preconditioners for systems of linear inequalities

Javier Peña; Vera Roshchina; Negar Soheili

We show that a combination of two simple preprocessing steps generally improves the conditioning of a homogeneous system of linear inequalities. Our approach is based on a comparison among three different but related notions of conditioning for linear inequalities.
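The abstract does not spell out the two preprocessing steps, so the sketch below is purely illustrative: it shows row normalization, a common and simple preconditioning idea for a homogeneous system A x > 0, used here as a hypothetical stand-in rather than the paper's actual method.

```python
import math

def normalize_rows(A):
    """Scale each row of A to unit Euclidean norm.

    Row scaling leaves the solution set of the homogeneous system
    A x > 0 unchanged, because each inequality <a_i, x> > 0 is
    invariant under multiplication by a positive scalar.
    """
    result = []
    for row in A:
        norm = math.sqrt(sum(v * v for v in row))
        if norm == 0.0:
            result.append(list(row))  # zero row: nothing to scale
        else:
            result.append([v / norm for v in row])
    return result

A = [[3.0, 4.0], [0.0, 2.0]]
B = normalize_rows(A)
# First row becomes [0.6, 0.8]; second row becomes [0.0, 1.0].
```

Because the scaling is solution-preserving, any conditioning measure that is sensitive to row magnitudes can only see the geometry of the system after such a step, not the arbitrary scales of individual inequalities.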

Collaboration

Dive into Vera Roshchina's collaboration.

Top Co-Authors

Javier Peña (Carnegie Mellon University)
Felipe Cucker (City University of Hong Kong)
Vladimir F. Demyanov (Saint Petersburg State University)
Liqun Qi (Hong Kong Polytechnic University)
Xiaoqi Yang (Hong Kong Polytechnic University)