Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Kamil A. Khan is active.

Publication


Featured research published by Kamil A. Khan.


Optimization Methods & Software | 2015

A vector forward mode of automatic differentiation for generalized derivative evaluation

Kamil A. Khan; Paul I. Barton

Numerical methods for non-smooth equation-solving and optimization often require generalized derivative information in the form of elements of the Clarke Jacobian or the B-subdifferential. It is shown here that piecewise differentiable functions are lexicographically smooth in the sense of Nesterov, and that lexicographic derivatives of these functions comprise a particular subset of both the B-subdifferential and the Clarke Jacobian. Several recently developed methods for generalized derivative evaluation of composite piecewise differentiable functions are shown to produce identical results, which are also lexicographic derivatives. A vector forward mode of automatic differentiation (AD) is presented for evaluation of these derivatives, generalizing established methods and combining their computational benefits. This forward AD mode may be applied to any finite composition of known smooth functions, piecewise differentiable functions such as the absolute value function, min, and max, and certain non-smooth functions which are not piecewise differentiable, such as the Euclidean norm. This forward AD mode may be implemented using operator overloading, does not require storage of a computational graph, and is computationally tractable relative to the cost of a function evaluation. An implementation in C is discussed.
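
As a minimal sketch of the building block this forward mode generalizes (the type name DDual and the test function are illustrative choices, not the authors' C implementation, which propagates a whole matrix of directions to recover lexicographic derivatives), the following C++ fragment propagates a single directional derivative by operator overloading, handling abs through its exact directional derivative at a kink:

```cpp
#include <cmath>
#include <iostream>

// Forward-mode AD scalar carrying a value and a directional derivative
// along one fixed direction (illustrative sketch, not the authors' library).
struct DDual {
    double val;  // function value
    double dot;  // directional derivative along the chosen direction
};

DDual operator+(DDual a, DDual b) { return {a.val + b.val, a.dot + b.dot}; }
DDual operator*(DDual a, DDual b) {
    return {a.val * b.val, a.dot * b.val + a.val * b.dot};  // product rule
}

// abs is piecewise differentiable; at a kink (val == 0) its directional
// derivative along d is |d|, which is what gets propagated here.
DDual abs(DDual a) {
    if (a.val > 0.0) return { a.val,  a.dot};
    if (a.val < 0.0) return {-a.val, -a.dot};
    return {0.0, std::fabs(a.dot)};
}

int main() {
    DDual x{0.0, 1.0};     // evaluate at x = 0 along direction d = 1
    DDual f = abs(x) * x;  // f(x) = |x| * x, nonsmooth at 0
    std::cout << "f = " << f.val << ", f'(x; d) = " << f.dot << "\n";
    // prints f = 0, f'(x; d) = 0, the correct directional derivative
}
```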


Journal of Optimization Theory and Applications | 2014

Generalized Derivatives for Solutions of Parametric Ordinary Differential Equations with Non-differentiable Right-Hand Sides

Kamil A. Khan; Paul I. Barton

Sensitivity analysis provides useful information for equation-solving, optimization, and post-optimality analysis. However, obtaining useful sensitivity information for systems with nonsmooth dynamic systems embedded is a challenging task. In this article, for any locally Lipschitz continuous mapping between finite-dimensional Euclidean spaces, Nesterov’s lexicographic derivatives are shown to be elements of the plenary hull of the (Clarke) generalized Jacobian whenever they exist. It is argued that in applications, and in several established results in nonsmooth analysis, elements of the plenary hull of the generalized Jacobian of a locally Lipschitz continuous function are no less useful than elements of the generalized Jacobian itself. Directional derivatives and lexicographic derivatives of solutions of parametric ordinary differential equation (ODE) systems are expressed as the unique solutions of corresponding ODE systems, under Carathéodory-style assumptions. Hence, the scope of numerical methods for nonsmooth equation-solving and local optimization is extended to systems with nonsmooth parametric ODEs embedded.
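
Schematically (notation illustrative, following the standard Carathéodory-style setting rather than the article verbatim), the result says that for a parametric ODE

\[
\dot{x}(t,p) = f\bigl(t, p, x(t,p)\bigr), \qquad x(t_0, p) = x_0(p),
\]

with f locally Lipschitz in (p, x), the directional derivative \(y(t) := x'(t, \cdot)(p; d)\) of the solution with respect to the parameters is itself the unique solution of an auxiliary ODE driven by directional derivatives of the right-hand side (taken with respect to (p, x) at fixed t):

\[
\dot{y}(t) = f'\bigl((p, x(t,p)); (d, y(t))\bigr), \qquad y(t_0) = x_0'(p; d),
\]

and an analogous auxiliary system yields the lexicographic derivatives.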


ACM Transactions on Mathematical Software | 2013

Evaluating an element of the Clarke generalized Jacobian of a composite piecewise differentiable function

Kamil A. Khan; Paul I. Barton

Bundle methods for nonsmooth optimization and semismooth Newton methods for nonsmooth equation solving both require computation of elements of the (Clarke) generalized Jacobian, which provides slope information for locally Lipschitz continuous functions. Since the generalized Jacobian does not obey sharp calculus rules, this computation can be difficult. In this article, methods are developed for evaluating generalized Jacobian elements for a nonsmooth function that is expressed as a finite composition of known elemental piecewise differentiable functions. In principle, these elemental functions can include any piecewise differentiable function whose analytical directional derivatives are known. The methods are fully automatable, and are shown to be computationally tractable relative to the cost of a function evaluation. An implementation developed in C++ is discussed, and the methods are applied to several example problems for illustration.


Optimization Methods & Software | 2018

Computationally relevant generalized derivatives: theory, evaluation and applications

Paul I. Barton; Kamil A. Khan; Peter G. Stechlinski; Harry A.J. Watson

A new method for evaluating generalized derivatives in nonsmooth problems is reviewed. Lexicographic directional (LD-)derivatives are a recently developed tool in nonsmooth analysis for evaluating generalized derivative elements in a tractable and robust way. Applicable to problems in both steady-state and dynamic settings, LD-derivatives exhibit a number of advantages over current theory and algorithms. As highlighted in this article, the LD-derivative approach now admits a suitable theory for inverse and implicit functions, nonsmooth dynamical systems and optimization problems, among others. Moreover, this technique includes an extension of the standard vector forward mode of automatic differentiation (AD) and acts as the natural extension of classical calculus results to the nonsmooth case in many ways. The theory of LD-derivatives is placed in the context of state-of-the-art methods in nonsmooth analysis, with an application in multistream heat exchanger modelling and design used to illustrate the usefulness of the approach.
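
As a rough sketch of the central object (following the notation of this line of work, with details deliberately compressed): for a lexicographically smooth \(f : \mathbb{R}^n \to \mathbb{R}^m\), a point x, and a directions matrix \(M = [m_1 \; \cdots \; m_k]\), the LD-derivative is built from a nested sequence of directional derivatives,

\[
f^{(0)}_{x,M}(d) := f'(x; d), \qquad
f^{(j)}_{x,M}(d) := \bigl[f^{(j-1)}_{x,M}\bigr]'(m_j; d), \quad j = 1, \dots, k-1,
\]
\[
f'(x; M) := \bigl[\, f^{(0)}_{x,M}(m_1) \;\; f^{(1)}_{x,M}(m_2) \;\; \cdots \;\; f^{(k-1)}_{x,M}(m_k) \,\bigr],
\]

and when M is square and nonsingular, the lexicographic derivative is recovered as \(\mathbf{J}_{\mathrm{L}} f(x; M) = f'(x; M)\, M^{-1}\). The sharp chain rule satisfied by \(f'(x; M)\) is what makes the vector-forward-mode propagation described above possible.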


Automatica | 2016

Generalized derivatives of dynamic systems with a linear program embedded

Kai Höffner; Kamil A. Khan; Paul I. Barton

Dynamic systems with a linear program (LP) embedded can be found in control and optimization of bioreactor models based on dynamic flux balance analysis (DFBA). Derivatives of the dynamic states with respect to a parameter vector are essential for open- and closed-loop dynamic optimization and parameter estimation of such systems. These derivatives, given by a forward sensitivity system, may not exist, because the optimal value of a linear program as a function of the right-hand side of the constraints is not continuously differentiable. Therefore, nonsmooth analysis must be applied, which provides optimality conditions in terms of subgradients, for convex functions, or Clarke's generalized gradient, for nonconvex functions. This work presents an approach to compute the necessary information for nonsmooth optimization, i.e., an element of the generalized gradient. Moreover, a numerical implementation of the results is introduced. The approach is illustrated through a large-scale dynamic flux balance analysis example.
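
Purely as an illustration of the problem class (a schematic form, not the article's exact formulation), a dynamic system with an LP embedded couples an ODE to the solution map of a linear program whose data depend on the state:

\[
\dot{x}(t) = g\bigl(x(t),\, v^*(x(t))\bigr), \qquad
v^*(x) \in \arg\max_{v}\ \{\, c^{\mathsf T} v \;:\; A v \le b(x) \,\},
\]

and since the LP's optimal-value and solution maps are nonsmooth functions of b(x), the composite right-hand side is nonsmooth, so classical forward sensitivities may fail to exist and generalized gradient elements are used instead.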


Conference on Decision and Control | 2014

Generalized gradient elements for nonsmooth optimal control problems

Kamil A. Khan; Paul I. Barton

Recent advances in nonsmooth sensitivity analysis are extended to describe particular elements of Clarke's generalized gradient for the nonsmooth objective function of a nonsmooth optimal control problem, in terms of the states of an auxiliary dynamic system. The considered optimal control problem is a generic nonlinear open-loop problem, in which the cost function and the right-hand side function describing the system dynamics may each be nonsmooth. The desired generalized gradient elements are obtained under two parametric discretizations of the control function: a representation as a linear combination of basis functions, and a piecewise constant representation. If the objective function under either discretization is convex, then the corresponding generalized gradient elements are subgradients, without requiring any convexity assumptions on the system dynamics.
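
For concreteness, the two control parameterizations mentioned above take the following schematic forms (the basis functions \(\varphi_i\) and mesh points \(t_i\) are illustrative choices, not prescribed by the article):

\[
u(t) = \sum_{i=1}^{n} p_i\, \varphi_i(t)
\qquad \text{or} \qquad
u(t) = p_i \quad \text{for } t \in [t_{i-1}, t_i),
\]

so that the control is encoded by a finite-dimensional parameter vector p and the objective becomes a nonsmooth function of p, to which the generalized gradient results apply.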


Journal of Optimization Theory and Applications | 2018

Generalized Derivatives of Lexicographic Linear Programs

Jose A. Gomez; Kai Höffner; Kamil A. Khan; Paul I. Barton

Lexicographic linear programs are fixed-priority multiobjective linear programs that provide a useful model of biological systems via flux balance analysis, and of goal-programming problems. The objective function values of a lexicographic linear program, as a function of its right-hand side, are nonsmooth. This work derives generalized derivative information for lexicographic linear programs using lexicographic directional derivatives to obtain elements of the Bouligand subdifferential (limiting Jacobian). It is shown that elements of the limiting Jacobian can be obtained by solving related linear programs. A nonsmooth equation-solving problem is solved to illustrate the benefits of using elements of the limiting Jacobian of lexicographic linear programs.
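
As a rough sketch (the standard fixed-priority formulation, not necessarily the article's exact notation), a lexicographic LP with prioritized objectives \(c_1, \dots, c_k\) solves a sequence of ordinary LPs in which each earlier optimal value is locked in as a constraint:

\[
z_1^*(b) = \min_{v}\ \{\, c_1^{\mathsf T} v : A v \le b \,\}, \qquad
z_j^*(b) = \min_{v}\ \{\, c_j^{\mathsf T} v : A v \le b,\ c_i^{\mathsf T} v = z_i^*(b)\ \text{for } i < j \,\},
\]

and each value function \(b \mapsto z_j^*(b)\) is generally nonsmooth, which is why Bouligand subdifferential elements rather than classical derivatives are the natural sensitivity objects.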


SIAM Journal on Optimization | 2018

Generalized Sensitivity Analysis of Nonlinear Programs

Peter G. Stechlinski; Kamil A. Khan; Paul I. Barton

This paper extends classical sensitivity results for nonlinear programs to cases in which parametric perturbations cause changes in the active set. This is accomplished using lexicographic directional derivatives, a recently developed tool in nonsmooth analysis based on Nesterov's lexicographic differentiation. A nonsmooth implicit function theorem is augmented with generalized derivative information and applied to a standard nonsmooth reformulation of the parametric KKT system. It is shown that the sufficient conditions for this implicit function theorem variant are implied by a KKT point satisfying the linear independence constraint qualification and strong second-order sufficiency. Mirroring the classical theory, the resulting sensitivity system is a nonsmooth equation system which admits primal and dual sensitivities as its unique solution. Practically implementable algorithms are provided for calculating the nonsmooth sensitivity system's unique solution, which is then used to furnish B-subdifferential…
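
To make the phrase "standard nonsmooth reformulation of the parametric KKT system" concrete, one common choice (shown only as an illustration; the article's precise reformulation may differ) replaces the complementarity conditions by a componentwise min, giving a piecewise differentiable equation system:

\[
F(z, p) =
\begin{bmatrix}
\nabla_x L(x, \lambda, \mu, p) \\
h(x, p) \\
\min\{\mu,\ -g(x, p)\}
\end{bmatrix} = 0,
\qquad z = (x, \lambda, \mu),
\]

where L is the Lagrangian, h = 0 the equality constraints, g ≤ 0 the inequality constraints, and the min is taken componentwise; because min is piecewise differentiable, LD-derivatives of F can be computed and fed to the nonsmooth implicit function theorem.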


Systems & Control Letters | 2015

Switching behavior of solutions of ordinary differential equations with abs-factorable right-hand sides

Kamil A. Khan; Paul I. Barton

We consider nonsmooth dynamic systems that are formulated as the unique solutions of ordinary differential equations (ODEs) with right-hand side functions that are finite compositions of analytic functions and absolute-value functions. Various non-Zenoness results are obtained for such solutions: in particular, any absolute-value function in the ODE right-hand side can only switch between its two linear pieces finitely many times on any finite duration, even when a discontinuous control input is included. These results are extended to obtain numerically verifiable necessary conditions for the emergence of “valley-tracing modes”, in which the argument of an absolute-value function is identically zero for a nonzero duration. Such valley-tracing modes can create theoretical and numerical complications during sensitivity analysis or optimization. We show that any valley-tracing mode must begin either at the initial time, or when another absolute-value function switches between its two linear pieces.
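
A one-line illustration of a valley-tracing mode (a hypothetical toy example, not taken from the article): the ODE \(\dot{x}(t) = -|x(t)|\) with \(x(0) = 0\) has the unique solution \(x \equiv 0\), so the argument of the absolute-value function is identically zero on the whole horizon; consistent with the result stated above, this valley-tracing mode begins at the initial time.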


Journal of Global Optimization | 2018

Corrections to: Differentiable McCormick relaxations

Kamil A. Khan; Matthew Wilhelm; Matthew D. Stuber; Huiyi Cao; Harry A.J. Watson; Paul I. Barton

These errata correct various errors in the closed-form relaxations provided by Khan, Watson, and Barton in the article “Differentiable McCormick Relaxations” (J Glob Optim, 67:687–729, 2017). Without these corrections, the provided closed-form relaxations may fail to be convex or concave and may fail to be valid relaxations.

Collaboration


Dive into Kamil A. Khan's collaborations.

Top Co-Authors

Paul I. Barton (Massachusetts Institute of Technology)
Harry A.J. Watson (Massachusetts Institute of Technology)
Kai Höffner (Massachusetts Institute of Technology)
Peter G. Stechlinski (Massachusetts Institute of Technology)
Jeffrey Larson (Argonne National Laboratory)
Jose A. Gomez (Massachusetts Institute of Technology)
Matthew D. Stuber (Massachusetts Institute of Technology)
Matthew Wilhelm (University of Connecticut)
Stefan M. Wild (Argonne National Laboratory)