Publications


Featured research published by Jay S. Treiman.


SIAM Journal on Optimization | 2003

An Extended Extremal Principle with Applications to Multiobjective Optimization

Boris S. Mordukhovich; Jay S. Treiman; Qiji J. Zhu

We develop an extended version of the extremal principle in variational analysis that can be treated as a variational counterpart to the classical separation results in the case of nonconvex sets and which plays an important role in the generalized differentiation theory and its applications to optimization-related problems. The main difference between the conventional extremal principle and the extended version developed below is that, instead of the translation of sets involved in the extremal systems, we allow deformations. The new version seems to be more flexible in various applications and covers, in particular, multiobjective optimization problems with general preference relations. In this way we obtain new necessary optimality conditions for constrained problems of multiobjective optimization with nonsmooth data and also for multiplayer multiobjective games.
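
For orientation, the conventional extremal principle that this paper extends can be sketched as follows (an approximate form with Fréchet normals; the notation and hypotheses here are illustrative, not quoted from the paper): if \(\bar{x}\) is a locally extremal point of the system of closed sets \(\Omega_1, \Omega_2\) in an Asplund space, then for every \(\varepsilon > 0\) there exist
\[
x_i \in \Omega_i \cap B_\varepsilon(\bar{x}), \qquad x_i^* \in \widehat{N}(x_i;\Omega_i) + \varepsilon B^*, \qquad i = 1, 2,
\]
with \(x_1^* + x_2^* = 0\) and \(\|x_1^*\| + \|x_2^*\| = 1\). The extended principle replaces the translations of sets that define extremality by more general deformations.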


Transactions of the American Mathematical Society | 1998

Necessary Conditions for Constrained Optimization Problems with Semicontinuous and Continuous Data

Jonathan M. Borwein; Jay S. Treiman; Qiji J. Zhu

We consider nonsmooth constrained optimization problems with semicontinuous and continuous data in Banach space and derive necessary conditions without constraint qualification in terms of smooth subderivatives and normal cones. These results, in different versions, are set in reflexive and smooth Banach spaces.
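
A schematic picture of a multiplier rule that holds without a constraint qualification is the Fritz John form, written here with a generic subdifferential \(\partial\) and normal cone \(N\) (the paper's actual conditions are stated with smooth subderivatives and may be approximate rather than exact): at a local minimizer \(\bar{x}\) of \(f\) subject to \(g_i(x) \le 0\) and \(x \in C\), there exist multipliers \(\lambda_0, \lambda_1, \dots, \lambda_m \ge 0\), not all zero, with
\[
0 \in \lambda_0\,\partial f(\bar{x}) + \sum_{i=1}^{m} \lambda_i\,\partial g_i(\bar{x}) + N(\bar{x}; C),
\qquad \lambda_i\, g_i(\bar{x}) = 0 .
\]
Without a constraint qualification, the multiplier \(\lambda_0\) on the objective may vanish.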


SIAM Journal on Control and Optimization | 1999

Lagrange Multipliers for Nonconvex Generalized Gradients with Equality, Inequality, and Set Constraints

Jay S. Treiman

A Lagrange multiplier rule for finite dimensional Lipschitz problems that uses a nonconvex generalized gradient is proven. This result uses either both the linear generalized gradient and the generalized gradient of Mordukhovich or the linear generalized gradient and a qualification condition involving the pseudo-Lipschitz behavior of the feasible set under perturbations. The optimization problem includes equality constraints, inequality constraints, and a set constraint. This result extends known nonsmooth results for the Lipschitz case.
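
Schematically, a multiplier rule of this type asserts the following (written with a generic generalized gradient \(\partial\) and normal cone \(N\); the paper's statement uses the linear generalized gradient or the Mordukhovich constructions and is sharper): at a local minimizer \(\bar{x}\) of \(f\) subject to \(g_i(x) \le 0\), \(h_j(x) = 0\), and \(x \in S\), there exist \(\lambda_i \ge 0\) and \(\mu_j \in \mathbb{R}\) with
\[
0 \in \partial f(\bar{x}) + \sum_{i} \lambda_i\,\partial g_i(\bar{x}) + \sum_{j} \partial\big(\mu_j h_j\big)(\bar{x}) + N(\bar{x}; S),
\qquad \lambda_i\, g_i(\bar{x}) = 0 .
\]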


SIAM Journal on Optimization | 1995

The Linear Nonconvex Generalized Gradient and Lagrange Multipliers

Jay S. Treiman

A Lagrange multiplier rule that uses small generalized gradients is introduced. It includes both inequality and set constraints. The generalized gradient is the linear generalized gradient. It is smaller than the generalized gradients of Clarke and Mordukhovich but retains much of their nice calculus. Its convex hull is the generalized gradient of Michel and Penot if a function is Lipschitz. The tools used in the proof of this Lagrange multiplier result are a coderivative, a chain rule, and a scalarization formula for this coderivative. Many smooth and nonsmooth Lagrange multiplier results are corollaries of this result. It is shown that the technique in this paper can be used for cases of equality, inequality, and set constraints if one considers the generalized gradient of Mordukhovich. An open question is: Does a Lagrange multiplier result hold when one has equality constraints and uses the linear generalized gradient?
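
The scalarization formula mentioned above is, in its commonly cited form (written schematically for a locally Lipschitz mapping \(f : X \to Y\) and a coderivative \(D^*\); the paper's version for the linear constructions may differ in detail),
\[
D^* f(\bar{x})(y^*) \;=\; \partial \langle y^*, f \rangle(\bar{x}) \qquad \text{for all } y^* \in Y^*,
\]
where \(\langle y^*, f\rangle(x) = \langle y^*, f(x)\rangle\) is the scalarized function.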


Transactions of the American Mathematical Society | 1986

Clarke’s gradients and epsilon-subgradients in Banach spaces

Jay S. Treiman

A new characterization of Clarke's normal cone to a closed set in a Banach space is given. The normal cone is characterized in terms of weak-star limits of epsilon normals. A similar characterization of Clarke's generalized gradients is also presented. Restrictions must be placed on the Banach spaces to make the formulas valid.
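
Schematically, the characterization has the following shape (sequential weak-star limits are used here for readability; the exact statement restricts the class of Banach spaces, as the abstract notes):
\[
N_C(\bar{x}; S) \;=\; \overline{\operatorname{co}}^{\,w^*}\Big\{ \, w^*\text{-}\lim_i x_i^* \;:\; x_i^* \in \widehat{N}_{\varepsilon_i}(x_i; S),\ x_i \to \bar{x} \text{ with } x_i \in S,\ \varepsilon_i \downarrow 0 \Big\},
\]
where \(N_C\) denotes Clarke's normal cone and \(\widehat{N}_{\varepsilon}\) the set of \(\varepsilon\)-normals.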


Nonlinear Analysis: Theory, Methods & Applications | 1999

Partially smooth variational principles and applications

Jonathan M. Borwein; Jay S. Treiman; Qiji J. Zhu

We discuss a smooth variational principle for partially smooth viscosity subdifferentials and explore its applications in nonsmooth analysis.


SIAM Journal on Control and Optimization | 1990

Optimal control with small generalized gradients

Jay S. Treiman

One of the main uses of generalized gradients is obtaining tight optimality conditions for optimal control problems. In this paper it is shown that the B-gradients satisfy the formula stating that the generalized gradients of an integral functional are contained in the “integral” of the generalized gradients. This formula is then applied to derive Euler–Lagrange equations and optimality conditions for a differential inclusion problem. All of these conditions can be stated in simple forms.
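
The formula referred to can be sketched, for an integral functional \(I(x) = \int_a^b f(t, x(t))\,dt\) (the notation is illustrative, not the paper's), as
\[
\partial I(\bar{x}) \;\subseteq\; \int_a^b \partial_x f\big(t, \bar{x}(t)\big)\,dt,
\]
understood in the sense that each generalized gradient of \(I\) at \(\bar{x}\) is represented by a measurable selection of \(t \mapsto \partial_x f(t, \bar{x}(t))\).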


Archive | 2014

Applications of Integration

Jay S. Treiman

Work can be defined as the change of energy in a system. In an elementary physics class it is often calculated as force times distance for the movement of a mass along a straight line under the application of a constant force. For example, if a 2 kg mass is moved up 2 m from the surface of the earth, we can approximate the force on the mass as \(F = 9.8 \cdot 2 = 19.6\) N. Since the mass is moved 2 m, the energy involved in moving the mass is \(E = 19.6 \cdot 2 = 39.2\) J. This section begins with a review of the ideas in Sect. 1.3 on page 19.
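
When the force varies with position, work becomes an integral; as a sketch of the standard formula (standard physics, not a quotation from the chapter), the work done moving an object along a line from \(a\) to \(b\) under a force \(F(x)\) is
\[
W = \int_a^b F(x)\,dx .
\]
With the constant force \(F(x) = 19.6\) N over a 2 m displacement this reduces to \(W = 19.6 \cdot 2 = 39.2\) J, matching the elementary calculation above.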


Archive | 2014

More on Limits

Jay S. Treiman

There are several other situations where the idea of a limit is needed. This section states the definitions for some of these situations and gives a theorem that is useful when trying to find limits.


Archive | 2014

Limits and Derivatives

Jay S. Treiman

The idea of a limit is central to all of calculus. Throughout the rest of your calculus classes it will be behind everything you learn. Most people do not understand this concept the first time they see it, but they can get a feeling for the basic idea with some effort.
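
For concreteness, the standard \(\varepsilon\)-\(\delta\) definition (a standard formulation, not necessarily the book's exact wording) reads:
\[
\lim_{x \to a} f(x) = L
\quad\Longleftrightarrow\quad
\forall\, \varepsilon > 0\ \exists\, \delta > 0:\ 0 < |x - a| < \delta \ \Rightarrow\ |f(x) - L| < \varepsilon .
\]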

Collaboration


Dive into Jay S. Treiman's collaborations.

Top Co-Authors

Qiji J. Zhu (Western Michigan University)
Roxin Zhang (Northern Michigan University)
Yuri S. Ledyaev (Western Michigan University)