
Publication


Featured research published by Sean P. Kenny.


16th AIAA Non-Deterministic Approaches Conference | 2014

The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

Luis G. Crespo; Sean P. Kenny; Daniel P. Giesy

NASA missions often involve the development of new vehicles and systems that must be designed to operate in harsh domains with a wide array of operating conditions. These missions involve high-consequence and safety-critical systems for which quantitative data is either very sparse or prohibitively expensive to collect. Limited heritage data may exist, but is also usually sparse and may not be directly applicable to the system of interest, making uncertainty quantification extremely challenging. NASA modeling and simulation standards require estimates of uncertainty and descriptions of any processes used to obtain these estimates. The NASA Langley Research Center has developed an uncertainty quantification challenge problem in an effort to focus a community of researchers towards a common problem. This challenge problem features key issues in both uncertainty quantification and robust design using a discipline-independent formulation. While the formulation is indeed discipline-independent, the underlying model, as well as the requirements imposed upon it, describes a realistic aeronautics application. A few high-level details of this application are provided at the end of this document. Additional information is available at: http://uqtools.larc.nasa.gov/nda-uq-challenge-problem-2014/.


Journal of Guidance, Control, and Dynamics | 2005

Reliability-Based Control Design for Uncertain Systems

Luis G. Crespo; Sean P. Kenny

This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.


AIAA Guidance, Navigation, and Control Conference | 2012

Analysis of Control Strategies for Aircraft Flight Upset Recovery

Luis G. Crespo; Sean P. Kenny; David E. Cox; Daniel G. Muri

This paper proposes a framework for studying the ability of a control strategy, consisting of a control law and a command law, to recover an aircraft from flight conditions that may extend beyond the normal flight envelope. This study was carried out (i) by evaluating time responses of particular flight upsets, (ii) by evaluating local stability over an equilibrium manifold that included stall, and (iii) by bounding the set in the state space from where the vehicle can be safely flown to wings-level flight. These states comprise what will be called the safely recoverable flight envelope (SRFE), a set containing the aircraft states from which a control strategy can safely stabilize the aircraft. Safe recovery implies that the transient response stays between prescribed limits before converging to steady horizontal flight. The calculation of the SRFE bounds yields the worst-case initial state corresponding to each control strategy. This information is used to compare alternative recovery strategies, determine their strengths and limitations, and identify the most effective strategy. In regard to the control law, the authors developed feedback/feedforward laws based on the gain scheduling of multivariable controllers. In regard to the command law, which is the mechanism governing the exogenous signals driving the feedforward component of the controller, they developed laws with a feedback structure that combines local stability and transient-response considerations. The upset recovery of the Generic Transport Model, a sub-scale twin-engine jet vehicle developed by NASA Langley Research Center, is used as a case study.


AIAA Guidance, Navigation and Control Conference and Exhibit | 2008

A Verification-driven Approach to Control Analysis and Tuning

Luis G. Crespo; Sean P. Kenny; Daniel P. Giesy

This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, it is applicable to realistic engineering problems.


Conference on Decision and Control | 2014

Interval predictor models with a formal characterization of uncertainty and reliability

Luis G. Crespo; Daniel P. Giesy; Sean P. Kenny

This paper develops techniques for constructing empirical predictor models based on observations. In contrast to standard models, which yield a single predicted output at each value of the model's inputs, Interval Predictor Models (IPMs) yield an interval into which the unobserved output is predicted to fall. The IPMs proposed prescribe the output as an interval-valued function of the model's inputs and render a formal description of both the uncertainty in the model's parameters and the spread in the predicted output. Uncertainty is prescribed as a hyper-rectangular set in the space of model parameters. The propagation of this set through the empirical model yields a range of outputs of minimal spread containing all (or, depending on the formulation, most) of the observations. Optimization-based strategies for calculating IPMs and eliminating the effects of outliers are proposed. Outliers are identified by evaluating the extent to which they degrade the tightness of the prediction. This evaluation can be carried out while the IPM is calculated. When the data satisfy mild stochastic assumptions, and the optimization program used for calculating the IPM is convex (or its solution coincides with the solution to an auxiliary convex program), the model's reliability (that is, the probability that a future observation will fall within the predicted range of outputs) can be bounded rigorously by a non-asymptotic formula.
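
As a rough illustration of the optimization-based strategy described above, the sketch below fits an IPM with affine lower and upper boundaries to scalar data by linear programming: the two boundary lines are chosen to minimize the average spread while containing every observation. The data, the affine basis, and all names are illustrative assumptions, not the paper's formulation.

```python
import numpy as np
from scipy.optimize import linprog

# illustrative data: the true map is y = 2x plus bounded noise
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 40)
y = 2.0 * x + rng.uniform(-0.5, 0.5, 40)

# IPM boundaries: lower line a_lo + b_lo*x, upper line a_hi + b_hi*x.
# Decision vector v = [a_lo, b_lo, a_hi, b_hi]; minimize the average spread
# (the spread is affine in x, so its mean equals its value at x.mean()).
c = np.array([-1.0, -x.mean(), 1.0, x.mean()])

# containment constraints: lower line below every y_i, upper line above it
A_lower = np.column_stack([np.ones_like(x), x, np.zeros_like(x), np.zeros_like(x)])
A_upper = np.column_stack([np.zeros_like(x), np.zeros_like(x), -np.ones_like(x), -x])
res = linprog(c, A_ub=np.vstack([A_lower, A_upper]),
              b_ub=np.concatenate([y, -y]), bounds=[(None, None)] * 4)
a_lo, b_lo, a_hi, b_hi = res.x  # every observation lies in [lower(x), upper(x)]
```

Because the spread is affine, non-negativity at the extreme data points implies non-negativity over the whole data range; the non-asymptotic reliability bound mentioned in the abstract would then relate the number of observations to the probability that a future sample falls inside the predicted interval.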


AIAA Guidance, Navigation and Control Conference and Exhibit | 2008

Figures of Merit for Control Verification

Luis G. Crespo; Sean P. Kenny; Daniel P. Giesy

This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods, where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.
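
For intuition on "sizing the largest uncertainty set": for a spherical set, the metric is the largest radius around the nominal parameter point for which no failure occurs. The sketch below computes it by bisection for a hypothetical quadratic requirement function whose worst case over a sphere has a closed form; the function g, the nominal point, and all names are assumptions for illustration, not the paper's formulation.

```python
import numpy as np

def g_worst(r, p_bar):
    # worst case of g(p) = ||p||^2 - 1 over the sphere ||p - p_bar|| <= r
    # (closed form; the spec is satisfied when g(p) < 0)
    return (np.linalg.norm(p_bar) + r) ** 2 - 1.0

def critical_radius(p_bar, r_hi=10.0, tol=1e-8):
    """Bisection for the largest r such that g(p) < 0 on the whole sphere."""
    lo, hi = 0.0, r_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g_worst(mid, p_bar) < 0.0:
            lo = mid   # sphere of radius mid is still safe; grow it
        else:
            hi = mid   # failure domain reached; shrink it
    return lo

print(critical_radius(np.array([0.3, 0.0])))  # -> 0.7, i.e. 1 - ||p_bar||
```

For realistic plants the inner worst-case evaluation is itself an optimization over the uncertainty set, but the outer logic of growing the set until a specification is first violated is the same.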


18th AIAA Non-Deterministic Approaches Conference | 2016

Application of Interval Predictor Models to Space Radiation Shielding

Luis G. Crespo; Sean P. Kenny; Daniel P. Giesy; Ryan B. Norman; Steve R. Blattnig

This paper develops techniques for predicting the uncertainty range of an output variable given input-output data. These models are called Interval Predictor Models (IPMs) because they yield an interval-valued function of the input. This paper develops IPMs having a radial basis structure. This structure enables the formal description of (i) the uncertainty in the model's parameters, (ii) the predicted output interval, and (iii) the probability that a future observation will fall in such an interval. In contrast to other metamodeling techniques, this probabilistic certificate of correctness does not require making any assumptions on the structure of the mechanism from which the data are drawn. Optimization-based strategies for calculating IPMs having minimal spread while containing all the data are developed. Constraints for bounding the minimum interval spread over the continuum of inputs, regulating the IPM's variation/oscillation, and centering its spread about a target point are used to prevent data overfitting. Furthermore, we develop an approach for using expert opinion during extrapolation. This metamodeling technique is illustrated using a radiation shielding application for space exploration. In this application, we use IPMs to describe the error incurred in predicting the flux of particles resulting from the interaction between a high-energy incident beam and a target.


American Control Conference | 1992

Nonlinear Modeling of a Long Flexible Manipulator and Control by Inertial Devices

Enrique Barbieri; Sean P. Kenny; Raymond C. Montgomery

We consider the modeling and control of a planar, long, flexible manipulator that is representative of current space-based robotic arms such as the Space Shuttle Remote Manipulator System. The arm is equipped with three actuators: 1) a shoulder motor; 2) a torque wheel at the tip; and 3) a proof-mass actuator at the tip. The goal is to investigate the potential use of inertial devices as control inputs for maneuvering tasks and vibration suppression. The parameters used for the inertial devices at the tip are comparable to those specified for the Mini-Mast facility at the Langley Research Center. A nonlinear distributed parameter model is obtained by the extended Hamilton Principle. The associated eigenvalue/eigenfunction problem is solved and a finite-dimensional state space model is assembled. A preliminary design of a Linear Quadratic Regulator is used and computer simulations illustrate the benefits of using the proposed actuators.
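
The Linear Quadratic Regulator step mentioned above can be sketched in a few lines. The system below is a toy double integrator, not the paper's finite-dimensional flexible-arm model; the matrices and weights are illustrative assumptions only.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# toy two-state plant x_dot = A x + B u (a double integrator)
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)             # state weighting in the quadratic cost
R = np.array([[1.0]])     # control weighting

P = solve_continuous_are(A, B, Q, R)  # solve the algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)       # optimal state-feedback gain, u = -K x
print(K)                              # -> [[1.0, sqrt(3)]] for this plant
```

In the paper the same machinery would be applied to the assembled state-space model of the arm, with the inertial devices entering through the columns of B.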


50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference | 2009

Sampling-based Strategies for the Estimation of Probabilistic Sensitivities

Luis G. Crespo; Sean P. Kenny; Daniel P. Giesy

This note presents the mathematical background and numerical evaluation of several approaches for calculating probabilistic sensitivities. In particular, we use the finite difference method and the Leibniz integral rule to derive two formulations for approximating derivatives of the mean, the variance, and the failure probability of a dependent variable with respect to the means and variances of the independent variables. These derivatives not only indicate the sensitivity to the uncertainty model assumed but also allow for the identification of the most dominant uncertain parameters. Deterministic sampling techniques that eliminate the random character of the numerical approximation are used to evaluate the resulting expressions. Examples admitting closed-form expressions for the sensitivities are used to validate the efficiency and accuracy of the approximations and to perform convergence analyses as a function of the discretization parameters. Remarks on the advantages and limitations of each method as well as our take on the best practices are also presented.
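
A minimal sketch of the finite-difference idea above for a single Gaussian input, using Gauss-Hermite quadrature as the deterministic sampling rule so the approximation has no random character. The test function and parameter values are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def mean_of_output(f, mu, sigma, n=16):
    # Gauss-Hermite quadrature: deterministic sampling of E[f(X)], X ~ N(mu, sigma^2)
    t, w = np.polynomial.hermite.hermgauss(n)
    x = mu + sigma * np.sqrt(2.0) * t
    return np.sum(w * f(x)) / np.sqrt(np.pi)

def d_mean_d_mu(f, mu, sigma, h=1e-4):
    # central finite difference of E[f(X)] w.r.t. the input mean,
    # reusing the same quadrature nodes on both sides of the perturbation
    return (mean_of_output(f, mu + h, sigma) - mean_of_output(f, mu - h, sigma)) / (2.0 * h)

f = lambda x: x**2  # closed form: E[f] = mu^2 + sigma^2, so dE[f]/dmu = 2*mu
print(d_mean_d_mu(f, mu=1.5, sigma=0.3))  # -> approximately 3.0
```

Because f is quadratic here, both the quadrature and the central difference are exact, which is the kind of closed-form validation case the abstract refers to.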


11th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference | 2006

Strict Constraint Feasibility in Analysis and Design of Uncertain Systems

Luis G. Crespo; Daniel P. Giesy; Sean P. Kenny

This paper proposes a methodology for the analysis and design optimization of models subject to parametric uncertainty, where hard inequality constraints are present. Hard constraints are those that must be satisfied for all parameter realizations prescribed by the uncertainty model. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles. These models make it possible to consider sets of parameters having comparable as well as dissimilar levels of uncertainty. Two alternative formulations for hyper-rectangular sets are proposed, one based on a transformation of variables and another based on an infinity norm approach. The suite of tools developed enables us to determine whether the satisfaction of hard constraints is feasible by identifying critical combinations of uncertain parameters. Since this practice is performed without sampling or partitioning the parameter space, the resulting assessments of robustness are analytically verifiable. Strategies that enable the comparison of the robustness of competing design alternatives, the approximation of the robust design space, and the systematic search for designs with improved robustness characteristics are also proposed. Since the problem formulation is generic and the solution methods only require standard optimization algorithms for their implementation, the tools developed are applicable to a broad range of problems in several disciplines.
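
In the simplest instance of the idea above, a linear constraint over a hyper-rectangle, the worst-case value has a closed form, so the hard constraint can be certified for every parameter realization without sampling or partitioning the parameter space. The constraint and box below are illustrative assumptions, not the paper's examples.

```python
import numpy as np

def worst_case_linear(a, b, center, halfwidth):
    """Maximum of g(p) = a @ p + b over the hyper-rectangle
    |p_i - center_i| <= halfwidth_i. The hard constraint g(p) <= 0
    holds for ALL parameter realizations in the box iff this
    worst case is <= 0; the maximizer identifies the critical
    combination of uncertain parameters."""
    return a @ center + np.abs(a) @ halfwidth + b

a = np.array([1.0, -2.0])         # illustrative constraint gradient
center = np.array([0.0, 0.0])     # nominal parameter value
halfwidth = np.array([1.0, 1.0])  # independently bounded uncertainties
print(worst_case_linear(a, -4.0, center, halfwidth))  # -> -1.0: feasible
```

For nonlinear constraints the worst case no longer has this closed form, which is where the paper's transformation-of-variables and infinity-norm formulations come in.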

Collaboration


Dive into Sean P. Kenny's collaborations.

Top co-authors:

Luis G. Crespo (National Institute of Aerospace)
Anuradha M. Annaswamy (Massachusetts Institute of Technology)
Dave Ghosh (Langley Research Center)
Travis E. Gibson (Brigham and Women's Hospital)