
Publication


Featured research published by Kendra L. Van Buren.


Wind Engineering | 2012

A Comparative Study: Predictive Modeling of Wind Turbine Blades

Kendra L. Van Buren; Sez Atamturktur

The structural analysis of wind turbine blades is completed using vastly different computational modeling strategies with varying levels of model sophistication and detail. Typically, the preference for one modeling strategy over another is decided according to the subjective judgment of the expert. The central question that arises is how to justify the chosen level of sophistication and detail through quantitative, objective and scientifically defendable metrics. This manuscript takes a step toward answering this question and investigates the level of sophistication and detail needed when modeling the cross-section of wind turbine blades by: i) rigorously quantifying the model incompleteness resulting from simplifying assumptions and ii) comparing the predictive maturity index associated with alternative modeling strategies. The concept of predictive maturity is illustrated on a prototype blade. The incompleteness of five alternative models with varying sophistication in the cross-section of the shell elements is assessed through model form error and the predictive maturity index. While model form error is observed to remain constant across levels of sophistication, the predictive maturity index reveals that models with lesser sophistication may have predictive capabilities comparable to more sophisticated, computationally expensive models.
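
As a rough illustration of how such a comparison can be scored, the sketch below computes a root-mean-square relative discrepancy between hypothetical model predictions and hypothetical test frequencies for five models of increasing sophistication; it is a minimal stand-in, not the paper's predictive maturity index, and all numbers are invented for illustration.

```python
# Minimal sketch (not the paper's implementation): compare alternative
# blade cross-section models by the RMS discrepancy between their
# predicted natural frequencies and hypothetical test data.
import numpy as np

# Hypothetical measured natural frequencies of a prototype blade (Hz).
measured = np.array([4.35, 11.51, 20.54])

# Hypothetical predictions from five models of increasing cross-section
# sophistication (rows: models, columns: modes).
predictions = np.array([
    [4.10, 11.90, 21.30],   # model 1: coarsest cross-section description
    [4.20, 11.75, 21.10],   # model 2
    [4.28, 11.62, 20.90],   # model 3
    [4.31, 11.58, 20.75],   # model 4
    [4.33, 11.54, 20.62],   # model 5: most detailed (and most expensive)
])

def model_form_error(pred, meas):
    """Root-mean-square relative discrepancy between predictions and tests."""
    return np.sqrt(np.mean(((pred - meas) / meas) ** 2))

for k, pred in enumerate(predictions, start=1):
    print(f"model {k}: model form error = {model_form_error(pred, measured):.4f}")
```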


Engineering Computations | 2015

Evaluating the fidelity and robustness of calibrated numerical model predictions: An application on a wind turbine blade

Garrison Stevens; Kendra L. Van Buren; Elizabeth Wheeler; Sez Atamturktur

Purpose – Numerical models are being increasingly relied upon to evaluate wind turbine performance by simulating phenomena that are infeasible to measure experimentally. These numerical models, however, require a large number of input parameters that often need to be calibrated against available experiments. Owing to the unavoidable scarcity of experiments and inherent uncertainties in measurements, this calibration process may yield non-unique solutions, i.e. multiple sets of parameters may reproduce the available experiments with similar fidelity. The purpose of this paper is to study the trade-off between fidelity to measurements and the robustness of this fidelity to uncertainty in calibrated input parameters.

Design/methodology/approach – Here, fidelity is defined as the ability of the model to reproduce measurements and robustness is defined as the allowable variation in the input parameters with which the model maintains a predefined level of threshold fidelity. These two vital attributes of model ...
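
The fidelity/robustness trade-off can be sketched numerically. The snippet below is a minimal illustration under assumed values: a one-degree-of-freedom surrogate model, hypothetical calibrated parameters, and a simple grid search for the largest parameter horizon that keeps the worst-case prediction error under a fidelity threshold; it is not the wind turbine application of the paper.

```python
# Minimal sketch of the fidelity/robustness trade-off (illustrative only;
# the model, parameter values and thresholds below are hypothetical).
import numpy as np

def model(stiffness, mass):
    """Toy surrogate: first natural frequency of a 1-DOF oscillator (Hz)."""
    return np.sqrt(stiffness / mass) / (2.0 * np.pi)

measured_freq = 1.60          # hypothetical test measurement (Hz)
k_cal, m_cal = 105.0, 1.02    # hypothetical calibrated parameters

def worst_case_error(alpha, n=200):
    """Largest |prediction - measurement| when the calibrated parameters
    are allowed to vary by +/- alpha (fractional horizon of uncertainty)."""
    k = k_cal * (1.0 + alpha * np.linspace(-1.0, 1.0, n))
    m = m_cal * (1.0 + alpha * np.linspace(-1.0, 1.0, n))
    K, M = np.meshgrid(k, m)
    return np.max(np.abs(model(K, M) - measured_freq))

def robustness(threshold, alphas=np.linspace(0.0, 0.5, 101)):
    """Largest horizon alpha whose worst-case error stays within threshold."""
    feasible = [a for a in alphas if worst_case_error(a) <= threshold]
    return max(feasible) if feasible else 0.0

for tol in (0.05, 0.10, 0.20):   # a looser fidelity requirement ...
    print(f"threshold {tol:.2f} Hz -> robustness {robustness(tol):.3f}")
    # ... buys more robustness: the trade-off studied in the paper.
```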


Archive | 2015

Mechanical Shock Environment Synthesis for Structural Failure Elicitation

Cassidy L. Fisher; Kaitlyn Kliewer; Gregory M. Naranjo; Stuart G. Taylor; Kendra L. Van Buren

Shock Response Spectra (SRS) are commonly used in dynamic testing to describe the mechanical environment in high-energy, non-stationary events, such as impacts or pyrotechnic shocks. Oftentimes, the service environment to which a structure will be exposed is difficult to reproduce in the laboratory, but design engineers desire a laboratory screening test to determine whether the structure will survive an anticipated shock environment. Herein, a combined experimental and numerical study is pursued to evaluate the efficacy of different methods to elicit failure modes of a service shock through destructive shaker tests of a custom-designed test article fabricated using commercially available 3-D printers. Design of the test article is explored through use of finite element modeling, which is found to correlate well to experimentally obtained natural frequencies. Four techniques to synthesize a service shock are compared: least favorable response, sum of decaying sinusoids, wavelet, and matching temporal moments. Destructive shaker tests of the shock responses are performed using 25 nominally identical test articles to assess the ability of each method to impose damage states similar to those obtained using the service shock. We find that the method of matching temporal moments best replicates failure modes of the service shock; however, further testing is needed to validate our observations.
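
One of the four synthesis techniques, the sum of decaying sinusoids, lends itself to a short sketch. The code below synthesizes a hypothetical base acceleration from three decaying sinusoids and estimates its maximax Shock Response Spectrum from single-degree-of-freedom oscillator responses; the amplitudes, frequencies, decay rates, and 5 % damping ratio are assumptions for illustration only, not the study's test set-up.

```python
# Minimal sketch (hypothetical numbers): synthesize a shock as a sum of
# decaying sinusoids and estimate its Shock Response Spectrum (SRS) from
# single-DOF base-excitation oscillator responses.
import numpy as np
from scipy.signal import lsim

dt = 1.0e-4
t = np.arange(0.0, 0.1, dt)

# Sum of decaying sinusoids: hypothetical amplitudes (g), frequencies (Hz)
# and decay rates (1/s) chosen only for illustration.
amps, freqs, decays = [50.0, 30.0, 15.0], [200.0, 600.0, 1500.0], [60.0, 90.0, 150.0]
base_accel = sum(a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
                 for a, f, d in zip(amps, freqs, decays))

def srs(accel, time, natural_freqs, zeta=0.05):
    """Maximax absolute-acceleration SRS via SDOF base-excitation models."""
    peaks = []
    for fn in natural_freqs:
        wn = 2.0 * np.pi * fn
        A = [[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]]   # state: [z, z_dot]
        B = [[0.0], [-1.0]]                            # input: base acceleration
        C = [[-wn**2, -2.0 * zeta * wn]]               # output: absolute acceleration
        D = [[0.0]]
        _, y, _ = lsim((A, B, C, D), accel, time)
        peaks.append(np.max(np.abs(y)))
    return np.array(peaks)

fn = np.logspace(np.log10(100.0), np.log10(3000.0), 20)
for f, g in zip(fn, srs(base_accel, t, fn)):
    print(f"{f:8.1f} Hz : {g:8.1f} g")
```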


Archive | 2014

Quantification of Prediction Bounds Caused by Model Form Uncertainty

Lindsey M. Gonzales; Thomas M. Hall; Kendra L. Van Buren; Steven R. Anton; François M. Hemez

Numerical simulations, irrespective of the discipline or application, are often plagued by arbitrary numerical and modeling choices. Arbitrary choices can originate from kinematic assumptions, for example the use of 1D beam, 2D shell, or 3D continuum elements; mesh discretization choices; boundary condition models; and the representation of contact and friction in the simulation. This work takes a step toward understanding the effect of arbitrary choices and model-form assumptions on the accuracy of numerical predictions. The application is the simulation of the first four resonant frequencies of a one-story aluminum portal frame structure under free-free boundary conditions. The main challenge of the portal frame structure resides in modeling the joint connections, for which different modeling assumptions are available. To study this model-form uncertainty, and compare it to other types of uncertainty, two finite element models are developed using solid elements, with differing representations of the beam-to-column and column-to-base plate connections: (1) contact stiffness coefficients or (2) tied nodes. Test-analysis correlation is performed by comparing the range of numerical predictions obtained from parametric studies of the joint modeling strategies to the range of experimentally obtained natural frequencies. The approach proposed is, first, to characterize the experimental variability of the joints by varying the bolt torque, the method of bolt tightening, and the sequence in which the bolts are tightened. The second step is to convert what is learned from these experimental studies into models that bound the range of observed bolt behavior. We show that this approach, which combines small-scale experiments, sensitivity analysis studies, and bounding-case models, successfully produces bounds on numerical predictions that match those measured experimentally on the frame structure. (Approved for unlimited, public release, LA-UR-13-27561.)
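
The bounding idea can be illustrated with a few lines of code: given hypothetical frequency predictions from parametric runs of the two joint models and a hypothetical experimental range, check whether the experimental range falls inside the envelope of predictions. The values are invented; only the bookkeeping mirrors the description above.

```python
# Minimal sketch (hypothetical values): check whether experimentally observed
# natural frequencies fall inside the envelope of frequencies predicted by
# parametric studies of two joint-modeling strategies (tied nodes vs.
# contact stiffness coefficients).
import numpy as np

# Hypothetical predicted frequencies (Hz) for the first four modes,
# rows = parametric runs, from both modeling strategies combined.
predicted_runs = np.array([
    [30.2, 88.5, 140.1, 225.4],
    [31.0, 90.2, 142.8, 229.0],
    [29.6, 87.1, 138.7, 223.1],
    [31.8, 91.4, 144.5, 231.6],
])

# Hypothetical experimental range from varying bolt torque and tightening order.
exp_low  = np.array([29.9, 87.8, 139.5, 224.0])
exp_high = np.array([31.5, 90.8, 143.9, 230.2])

pred_low, pred_high = predicted_runs.min(axis=0), predicted_runs.max(axis=0)
bounded = (pred_low <= exp_low) & (exp_high <= pred_high)

for mode, ok in enumerate(bounded, start=1):
    print(f"mode {mode}: experimental range bounded by model predictions? {ok}")
```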


Archive | 2017

A Case Study in Predictive Modeling Beyond the Calibration Domain

Philip Graybill; Eyob Tarekegn; Ian Tomkinson; Kendra L. Van Buren; François M. Hemez; Scott Cogan

While numerical modeling is an important tool in many areas of engineering, caution must be exercised when developing and applying these models. This is especially true when models are developed under calibration conditions, referred to herein as the calibration domain, and applied to predict (or forecast) outcomes under a different set of conditions, referred to as the forecasting domain. This work discusses a case study of the predictive capability of a simple model away from its calibration domain. The application is to predict the payload that a quadcopter is able to lift. Model development is supported by two calibration experiments. The first experiment measures displacements obtained by attaching masses to various springs; it is used to develop a model that predicts displacement as a function of weight. The second experiment measures displacements resulting from spinning propeller blades of various dimensions; it is used to develop a model that predicts displacement as a function of blade diameter and revolutions per minute. Both models are combined to predict the payload that a quadcopter can lift, which represents an extrapolated forecast because the conditions of the quadcopter differ from those under which the models are calibrated. Finally, the quadcopter is tested experimentally to assess the predictive accuracy of the model. This application illustrates a preliminary thought process to ultimately determine how models developed in calibration domains perform in forecasting domains. (Approved for unlimited, public release, LA-UR-16-24484.)
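
A minimal sketch of the workflow, with entirely hypothetical data and a crude feature choice, is shown below: two small regression models are calibrated (displacement versus weight, and displacement versus a diameter/RPM feature) and then combined to extrapolate a payload forecast. Nothing here reproduces the paper's measurements or model forms.

```python
# Minimal sketch of combining two calibrated models to make an extrapolated
# forecast; all data, feature choices and dimensions are hypothetical and
# only illustrate the workflow.
import numpy as np

# Calibration experiment 1: spring displacement (mm) vs. attached weight (N).
weights = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
disp_spring = np.array([2.1, 4.0, 6.2, 8.1, 10.2])
stiffness_fit = np.polyfit(weights, disp_spring, 1)   # disp ~= a*weight + b

# Calibration experiment 2: displacement (mm) vs. blade diameter (m) and RPM,
# using the single feature (diameter^2 * rpm^2) as a crude thrust proxy.
diam = np.array([0.20, 0.20, 0.25, 0.25, 0.30])
rpm  = np.array([4000, 6000, 4000, 6000, 5000])
disp_prop = np.array([3.1, 7.2, 4.9, 11.0, 10.4])
feature = (diam**2) * (rpm**2)
coef, *_ = np.linalg.lstsq(feature[:, None], disp_prop, rcond=None)

# Forecast: quadcopter with four 0.28 m blades at 5500 RPM (outside calibration).
disp_forecast = coef[0] * (0.28**2) * (5500**2)
a, b = stiffness_fit
thrust_per_rotor = (disp_forecast - b) / a          # invert the spring model
payload = 4.0 * thrust_per_rotor                    # four rotors, in newtons
print(f"forecast payload ~ {payload:.1f} N (extrapolation beyond calibration)")
```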


Archive | 2016

Video Analysis in Multi-Intelligence

Kendra L. Van Buren; Will Warren; François M. Hemez

This project was performed by a recent high school graduate at Los Alamos National Laboratory (LANL). The goal of the Multi-intelligence (MINT) project is to determine the state of a facility from multiple data streams, which are indirect observations. The researcher uses DARHT (Dual-Axis Radiographic Hydrodynamic Test Facility) as a proof of concept. In summary, videos from the DARHT facility contain a rich amount of information. The distribution of car activity can inform us about the state of the facility, and counting large vehicles shows promise as another feature for identifying the state of operations. Signal processing techniques are limited by the low resolution and compression of the videos. We are working on integrating these features with features obtained from other data streams to contribute to the MINT project. Future work can pursue other observations, such as whether the gate is functioning or non-functioning.
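
A simple frame-differencing activity measure of the kind alluded to above can be sketched with OpenCV; the file name, blur kernel, and threshold below are placeholders, and the snippet only estimates the fraction of changed pixels per frame rather than counting vehicles.

```python
# Minimal sketch (illustrative only): per-frame motion activity in a
# low-resolution video via frame differencing with OpenCV.
import cv2

cap = cv2.VideoCapture("facility_clip.mp4")   # placeholder path

def preprocess(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (5, 5), 0)

ok, frame = cap.read()
prev = preprocess(frame) if ok else None

activity = []   # fraction of pixels changed between consecutive frames
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    cur = preprocess(frame)
    diff = cv2.absdiff(cur, prev)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    activity.append(cv2.countNonZero(mask) / mask.size)
    prev = cur

cap.release()
if activity:
    print(f"{len(activity)} frames, mean changed-pixel fraction "
          f"{sum(activity) / len(activity):.4f}")
```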


Archive | 2016

Making Structural Condition Diagnostics Robust to Environmental Variability

Harry Edwards; Kyle Neal; Jack Reilly; Kendra L. Van Buren; François M. Hemez

Advances in sensor deployment and computational modeling have allowed significant strides to be made recently in the field of Structural Health Monitoring (SHM). One widely used SHM technique is to perform a vibration analysis in which a model of the structure’s pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability and unknown values of model parameters. Not accounting for uncertainty in the analysis can lead to false positives or false negatives in the assessment of the structural condition. To manage the aforementioned uncertainty, we propose a robust-SHM methodology that combines three technologies. The first is a time series algorithm trained using “baseline” data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem: an analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate “size” of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes in time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess whether the strategy holds promise for robust SHM.
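
The first two ingredients can be sketched as follows: an autoregressive model fit to hypothetical baseline vibration data, a residual-based damage indicator, and a sweep over an assumed environmental perturbation to obtain worst-case indicator values. This is an illustrative stand-in inspired by the description above, not the authors' implementation.

```python
# Minimal sketch (hypothetical signals): an autoregressive (AR) model trained
# on "baseline" data, a residual-based damage indicator, and a simple sweep
# over an environmental perturbation to mimic the robustness analysis.
import numpy as np

rng = np.random.default_rng(0)

def ar_fit(x, order=4):
    """Least-squares AR coefficients: x[t] ~ sum_k a_k * x[t-k]."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    coeffs, *_ = np.linalg.lstsq(X, x[order:], rcond=None)
    return coeffs

def damage_indicator(x, coeffs):
    """RMS of one-step-ahead AR prediction residuals."""
    order = len(coeffs)
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    return np.sqrt(np.mean((x[order:] - X @ coeffs) ** 2))

t = np.linspace(0.0, 10.0, 2000)
baseline = np.sin(2 * np.pi * 3.0 * t) + 0.1 * rng.standard_normal(t.size)
coeffs = ar_fit(baseline)

# Sweep an environmental "horizon of uncertainty": a frequency shift of up to
# +/- alpha (fractional), and record the worst-case damage indicator.
for alpha in (0.00, 0.02, 0.05, 0.10):
    shifts = np.linspace(-alpha, alpha, 11)
    worst = max(damage_indicator(
        np.sin(2 * np.pi * 3.0 * (1 + s) * t) + 0.1 * rng.standard_normal(t.size),
        coeffs) for s in shifts)
    print(f"alpha = {alpha:.2f}: worst-case damage indicator = {worst:.4f}")
```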


Archive | 2016

Designing a Mechanical Latch for Robust Performance

François M. Hemez; Kendra L. Van Buren

Advances in computational sciences in the past three decades, such as those embodied by the finite element method, have made it possible to perform design and analysis using numerical simulations. While they offer undeniable benefits for rapid prototyping and can shorten the design-test-optimize cycle, numerical simulations also introduce assumptions and various sources of uncertainty. Examples are modeling assumptions proposed to represent a nonlinear material behavior, energy dissipation mechanisms and environmental conditions, in addition to numerical effects such as truncation error, mesh adaptation and artificial dissipation. Given these sources of uncertainty, what is the best way to support a design decision using simulations? We propose that an effective simulation-based design hinges on the ability to establish the robustness of its performance to assumptions and sources of uncertainty. Robustness means that exploring the uncertainty space that characterizes the simulation does not lead to a violation of the performance requirement. The theory of information-gap (“info-gap”) decision-making under severe uncertainty is applied to assess the robustness of two competing designs. The application is the dynamic stress performance of a mechanical latch for a consumer electronics product. The results show that the variant design yields only a 10 % improvement in robustness to uncertainty while requiring 44 % more material for manufacturing. The analysis provides a rigorous rationale to decide that the variant design is not viable. (Approved for unlimited, public release, LA-UR-21296, unclassified.)
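
The robustness comparison can be caricatured in a few lines: for each design, find the largest fractional load uncertainty for which a toy stress surrogate still satisfies an assumed allowable. The stress limit, surrogate model, and design factors below are hypothetical and chosen only to mimic the flavor of the result, not the latch analysis itself.

```python
# Minimal sketch (all numbers hypothetical): an info-gap style robustness
# comparison of two designs. Robustness is the largest fractional load
# uncertainty for which the worst-case peak stress still meets the limit.
import numpy as np

STRESS_LIMIT = 100.0          # hypothetical allowable peak stress (MPa)
nominal_load = 1.0            # normalized drop/impact load

def peak_stress(design_factor, load):
    """Toy surrogate: peak stress scales linearly with the applied load."""
    return design_factor * load

designs = {"baseline": 80.0, "variant": 78.4}   # hypothetical stress at nominal load

def robustness(design_factor, alphas=np.linspace(0.0, 1.0, 1001)):
    feasible = alphas[peak_stress(design_factor, nominal_load * (1.0 + alphas))
                      <= STRESS_LIMIT]
    return feasible.max() if feasible.size else 0.0

for name, factor in designs.items():
    print(f"{name}: robustness to load uncertainty = {robustness(factor):.3f}")
```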


Archive | 2015

Robust-Optimal Design Using Multifidelity Models

Kendra L. Van Buren; François M. Hemez

Applications in engineering analysis and design have benefited from the use of numerical models to supplement or replace the costly design-build-test paradigm. Previous work has acknowledged that design optimization should not only consider the performance of the model, but also be as insensitive as possible, or robust, to sources of uncertainty that are used to define the simulation. Clearly, evaluating robustness to sources of uncertainty can be computationally expensive, due to the number of iterations required at every step of the optimization. Multifidelity techniques have been introduced to mitigate this computational expense by taking advantage of fast-running lower-fidelity models or emulators. Herein, to achieve robust design, we argue that it is more effective to reduce the total range of variation in model performance rather than to reduce the standard deviation of model performances due to uncertainty in calibration variables of the model. We utilize a multifidelity approach to apply this paradigm to a sub-problem of the NASA Uncertainty Quantification Challenge problem, which is a high-dimensional and nonlinear MATLAB-based code used to simulate dynamics of remotely operated aircraft developed at NASA Langley. This method demonstrates an alternative and computationally efficient approach to robust design.
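
A toy version of this paradigm is sketched below: a cheap low-fidelity function plus an additive correction trained on a handful of assumed high-fidelity evaluations, and a design search that minimizes the total range of performance over uncertainty samples rather than its standard deviation. The functions and sample sizes are hypothetical, not the NASA challenge problem.

```python
# Minimal sketch (hypothetical toy functions): robust design that minimizes
# the total RANGE of performance over an uncertain parameter, using a cheap
# low-fidelity model plus an additive correction trained on a few
# high-fidelity runs, in the spirit of a multifidelity approach.
import numpy as np

def hi_fid(x, u):   # expensive "truth" model (hypothetical)
    return (x - 2.0) ** 2 * (1.0 + 0.2 * u) + 0.5 * np.cos(x) * u + 0.3

def lo_fid(x, u):   # fast approximation missing a constant offset
    return (x - 2.0) ** 2 * (1.0 + 0.2 * u) + 0.5 * np.cos(x) * u

# Additive correction: average discrepancy from a few high-fidelity samples.
x_train = np.linspace(0.0, 4.0, 5)
u_train = np.linspace(-1.0, 1.0, 5)
delta = np.mean([hi_fid(x, u) - lo_fid(x, u) for x in x_train for u in u_train])

def mf_model(x, u):
    return lo_fid(x, u) + delta          # multifidelity surrogate

u_samples = np.linspace(-1.0, 1.0, 41)   # uncertainty samples
candidates = np.linspace(0.0, 4.0, 81)   # candidate designs

def perf_range(x):
    vals = mf_model(x, u_samples)
    return vals.max() - vals.min()        # total range, not standard deviation

best = min(candidates, key=perf_range)
print(f"robust design x* ~ {best:.2f}, performance range {perf_range(best):.3f}")
```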


53rd AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference / 20th AIAA/ASME/AHS Adaptive Structures Conference / 14th AIAA | 2012

Developing Simplified Models for Wind Turbine Blades

Kendra L. Van Buren; Mark Mollineaux; François M. Hemez; Darby J. Luscher

Simplified beam models provide a computationally efficient method for modeling the vibration of wind turbine blades. The purpose of this paper is to demonstrate the process of developing a simplified beam model of the CX-100 wind turbine blade and quantifying its predictive capability. The motivation for this study is rooted in the development of NLBeam, a non-linear beam code developed at Los Alamos National Laboratory to simulate the structural dynamics response of wind turbines using geometrically exact beam theory in a coupled atmospheric hydrodynamics solver. Verification activities used to assess the credibility of NLBeam are investigated. Two models of the CX-100 blade are compared: (1) a three-dimensional shell model and (2) a simplified one-dimensional beam model. Two sets of experimental modal data are utilized, one with the CX-100 blade in a fixed-free condition and one in the same fixed-free condition with large masses applied. By exploring these different configurations of the wind turbine blade, credibility can be established regarding the ability of the FE model to predict the response to different loading conditions. Through the use of test-analysis correlation, the experimental data can be compared to model output and an assessment is given of the predictive capability of the model.
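
Test-analysis correlation of the kind described here typically compares frequencies and mode shapes; the sketch below computes frequency errors and the standard Modal Assurance Criterion (MAC) for hypothetical test and beam-model modes. All numbers are invented; only the MAC formula is standard.

```python
# Minimal sketch of test-analysis correlation (hypothetical numbers): compare
# predicted natural frequencies with modal-test frequencies and score the mode
# shapes with the Modal Assurance Criterion (MAC).
import numpy as np

def mac(phi_a, phi_x):
    """Modal Assurance Criterion between two mode-shape vectors."""
    return (abs(phi_a @ phi_x) ** 2) / ((phi_a @ phi_a) * (phi_x @ phi_x))

# Hypothetical test and simplified-beam-model frequencies (Hz) for three modes.
f_test  = np.array([7.3, 19.8, 32.1])
f_model = np.array([7.5, 20.6, 30.9])

# Hypothetical mode shapes sampled at four sensor locations (rows = modes).
phi_test  = np.array([[0.20, 0.50, 0.80, 1.00],
                      [0.50, 1.00, 0.30, -0.80],
                      [0.90, 0.10, -1.00, 0.60]])
phi_model = np.array([[0.22, 0.48, 0.82, 1.00],
                      [0.47, 1.00, 0.35, -0.75],
                      [0.85, 0.15, -1.00, 0.55]])

for k in range(len(f_test)):
    err = 100.0 * (f_model[k] - f_test[k]) / f_test[k]
    print(f"mode {k + 1}: frequency error {err:+5.1f} %, "
          f"MAC {mac(phi_model[k], phi_test[k]):.3f}")
```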

Collaboration

Top co-authors of Kendra L. Van Buren:

François M. Hemez, Los Alamos National Laboratory
Kyle Neal, Vanderbilt University
Steven R. Anton, Tennessee Technological University
Harry Edwards, Atomic Weapons Establishment
Thomas M. Hall, Atomic Weapons Establishment
Scott Cogan, University of Franche-Comté