Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where James H. Bigelow is active.

Publication


Featured research published by James H. Bigelow.


Mathematical Programming | 1974

Implicit function theorems for mathematical programming and for systems of inequalities

James H. Bigelow; Norman Shapiro

Implicit function formulas for differentiating the solutions of mathematical programming problems satisfying the conditions of the Kuhn–Tucker theorem are motivated and rigorously demonstrated. The special case of a convex objective function with linear constraints is also treated, with emphasis on computational details. An example, an application to chemical equilibrium problems, is given. Implicit function formulas for differentiating the unique solution of a system of simultaneous inequalities are also derived.
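The special case the abstract mentions (convex objective, linear constraints) can be sketched on a small hypothetical program: differentiating the KKT conditions yields a linear system for the solution sensitivity dx*/dp, which can be checked against finite differences. The program, its data, and the function names below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical parametric convex program (not the paper's example):
#   minimize  x1^2 + (x2 - p)^2   subject to  x1 + x2 = 1.
# The KKT conditions define x*(p) implicitly; differentiating them in p
# gives the linear "KKT Jacobian" system solved below.

def solution(p):
    # Closed form obtained by solving the KKT conditions by hand.
    return np.array([(1 - p) / 2, (1 + p) / 2])

def sensitivity():
    # KKT Jacobian [[H, A^T], [A, 0]]: H = objective Hessian,
    # A = equality-constraint matrix. RHS = -d(KKT residual)/dp.
    H = np.diag([2.0, 2.0])
    A = np.array([[1.0, 1.0]])
    K = np.block([[H, A.T], [A, np.zeros((1, 1))]])
    rhs = np.array([0.0, 2.0, 0.0])  # -d(grad_x L)/dp = [0, 2]; constraint row 0
    return np.linalg.solve(K, rhs)[:2]  # dx*/dp

p, eps = 0.3, 1e-6
fd = (solution(p + eps) - solution(p - eps)) / (2 * eps)  # finite-difference check
print(sensitivity())  # ≈ [-0.5, 0.5], matching fd
```

Because the solution here is linear in p, the implicit-function sensitivity and the finite difference agree exactly; in general they agree to first order near a solution where the Kuhn–Tucker conditions hold with the needed regularity.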


Proceedings of SPIE | 1998

Introduction to multiresolution modeling (MRM) with an example involving precision fires

Paul K. Davis; James H. Bigelow

In this paper we review motivations for multiresolution modeling (MRM) within a single model, an integrated hierarchical family of models, or both. We then present a new depiction of consistency criteria for models at different levels. After describing our hypotheses for studying the process of MRM with examples, we define a simple but policy-relevant problem involving the use of precision fires to halt an invading army. We then illustrate MRM with a sequence of abstractions suggested by formal theory, visual representation, and approximation. We milk the example for insights about why MRM is different and often difficult, and how it might be accomplished more routinely. It should be feasible even in complex systems such as JWARS and JSIMS, but it is by no means easy. Comprehensive MRM designs are unlikely. It is useful to take the view that some MRM is a great deal better than none and that approximate MRM relationships are often quite adequate. Overall, we conclude that high-quality MRM requires new theory, design practices, modeling tools, and software tools, all of which will take some years to develop. Current object-oriented programming practices may actually be a hindrance.


Journal of Theoretical Biology | 1973

Systems analysis of the renal function

James H. Bigelow; J.C. Dehaven; M.L. Shapley

A previous model of the renal function and compartmented whole body, which incorporated actions of antidiuretic hormone on urine flow and composition, is extended to include the influences of additional phenomena, many exogenous to the kidney. The phenomena incorporated were selected by orderly trial from among those mechanisms suggested in the literature and found by operation of the model to be important for adequately representing renal responses to various stresses. Specifically, these phenomena are: the intrinsic-osmotic effect of water; details of the body's antidiuretic hormone cycle; gastrointestinal exchanges; changes in glomerular filtration caused by alteration in blood volume and pressure; and resistance to flow across kidney tubular walls. Most of these phenomena involve non-steady-state, irreversible processes. The modeled renal function was found not to be sensitive to certain mechanisms sometimes associated with its operation. The most important is volume sensing, whereby extracellular volume changes are postulated to impose control on antidiuretic hormone production through stretch receptors in the circulatory system or through other volume sensors. Model investigations show that blood volume changes that alter blood pressure during stress can operate to affect kidney tubular-fluid flow in a manner which, when combined with other modeled kidney processes, yields correct urinary and other responses. No additional control on the antidiuretic hormone cycle through special volume receptors is required. The stresses to which the present model was exposed, and to which it responds correctly, are: water loading through ingestion; hypertonic saline infusion; hypertonic urea solution ingestion; antidiuretic hormone dysfunction, as in diabetes insipidus; controlled rehydration after dehydration; and two combined stresses—hypertonic saline solution ingestion followed by water ingestion and the converse experiment.
Most of these stresses involve transients in which the body becomes far removed from steady state. Different aspects of the total renal function, depending upon the specific stress, become more or less important in aiding the body to return to its original state.


SIAM Journal on Applied Mathematics | 1970

Chemical Equilibrium Problems with Unbounded Constraint Sets

James H. Bigelow; James C. DeHaven; Norman Shapiro

An investigation of the use of mathematical models to explore the chemical aspects of physiological systems; this deals with the theoretical and computational aspects of understanding the chemistry of human physiological function. The question of existence of solutions to problems having unbounded constraint sets is investigated by relating their existence (or nonexistence) to a property of a solution to an auxiliary chemical equilibrium problem with a bounded constraint set. An example system is selected consisting of gases in contact with an aqueous buffer solution at a uniform total hydrostatic pressure and temperature. The numerical problem of determining the amount of CO2 to be added to achieve a specified partial pressure of CO2 in the gas phase, and its effects on the composition of the total system, is solved by using a procedure suggested by the concept of unbounded constraint sets. Findings may apply to design of artificial life-support systems needed in extraterrestrial environments related to Air Force missions.
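Chemical equilibrium problems of this kind are conventionally posed as minimizing a Gibbs free-energy function over mole numbers subject to linear element-balance constraints. The sketch below uses that standard formulation with a made-up three-species system and invented dimensionless potentials g_i; it is not the paper's gas/buffer example or its algorithm.

```python
import numpy as np
from scipy.optimize import minimize

# Standard free-energy-minimization formulation (toy data, not the
# paper's system): find mole numbers n >= 0 minimizing
#   G(n) = sum_i n_i * (g_i + ln(n_i / n_tot))
# subject to element balances A n = b.

g = np.array([1.0, 2.0, -1.0])          # hypothetical g_i / RT values
A = np.array([[1.0, 0.0, 1.0],          # toy element-balance matrix
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])                # toy element totals

def gibbs(n):
    n = np.maximum(n, 1e-12)            # guard the logarithm near zero
    return float(np.sum(n * (g + np.log(n / n.sum()))))

res = minimize(gibbs, x0=np.array([0.5, 0.5, 0.5]),
               constraints={"type": "eq", "fun": lambda n: A @ n - b},
               bounds=[(1e-12, None)] * 3, method="SLSQP")
n_eq = res.x                            # equilibrium composition
```

The constraint set here is bounded, so a minimizer exists; the paper's contribution concerns what happens when it is unbounded, which this sketch does not attempt to reproduce.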


Enabling Technology for Simulation Science Conference | 2000

Case history of using entity-level simulation as imperfect experimental data for informing and calibrating simpler analytical models for interdiction

James H. Bigelow; Paul K. Davis; Jimmie McEver

We have used detailed, entity-level models to simulate the effects of long-range precision fires employed against an invader. Results show that these fires are much less effective against dispersed formations marching through mixed terrain than against dense formations in open terrain. We expected some loss of effectiveness, but not as much as observed. So we built a low-resolution model (PEM, or PGM Effectiveness Model) and calibrated it to the results of the detailed simulation. PEM explains analytically how various situational and tactical factors, which are usually treated only in complex models, can influence the effectiveness of these fires. The variables we consider are characteristics of the C4ISR system (e.g., time of last update), missile and weapon characteristics (e.g., footprint), maneuver pattern of the advancing column (e.g., vehicle spacing), and aggregate terrain features (e.g., open versus mixed terrain).
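The calibration idea can be sketched with a deliberately minimal stand-in for PEM: posit a closed-form relation between one tactical variable and effectiveness, then fit its parameters to noisy "simulation" results. The functional form, numbers, and variable names below are invented; the real PEM treats many interacting factors.

```python
import numpy as np

# Minimal sketch of calibrating a low-resolution model to entity-level
# simulation output. Hypothetical form (not PEM's actual equations):
#   kills_per_salvo = a * exp(-b * vehicle_spacing)

rng = np.random.default_rng(0)
spacing = np.linspace(20.0, 200.0, 25)        # vehicle spacing, meters
true_a, true_b = 8.0, 0.012                   # made-up "simulation truth"
sim_kills = true_a * np.exp(-true_b * spacing) * rng.lognormal(0, 0.05, 25)

# Log-linear least squares: log kills = log a - b * spacing.
X = np.column_stack([np.ones_like(spacing), -spacing])
coef, *_ = np.linalg.lstsq(X, np.log(sim_kills), rcond=None)
a_hat, b_hat = np.exp(coef[0]), coef[1]       # calibrated parameters
```

Once calibrated, such a closed form can be evaluated cheaply across many scenarios, which is the point of replacing the detailed simulation for survey work.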


Winter Simulation Conference | 2000

Informing and calibrating a multiresolution exploratory analysis model with high resolution simulation: the interdiction problem as a case history

Paul K. Davis; James H. Bigelow; Jimmie McEver

Exploratory analysis uses a low-resolution model for broad survey work. High-resolution simulation can sometimes be used to inform development and calibration of such a model. The paper is a case history of such an effort. The problem at issue was characterizing the effectiveness, in interdicting an invading army, of long-range precision fires. After observing puzzling results from high-resolution simulation, we developed a multiresolution personal computer model called PEM to explain the phenomena analytically. We then studied the simulation data in depth to assess, adjust, and calibrate PEM, while at the same time discovering and accounting for various shortcomings or subtleties of the high-resolution simulation and data. The resulting PEM model clarified results and allowed us to explore a wide range of additional circumstances. It credibly predicted changes in effectiveness over two orders of magnitude, depending on situational factors involving C4ISR, maneuver patterns, missile and weapon characteristics, and type of terrain. The insights gained appear valid and a simplified version of PEM could be used for scaling adjustments in comprehensive theater-level models.


Enabling Technology for Simulation Science Conference | 2000

Implementing multiresolution models and families of models: from entity-level simulation to desktop stochastic models and "repro" models

Jimmie McEver; Paul K. Davis; James H. Bigelow

We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated Analytica™ environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools, especially for use with personal computers. We conclude with some lessons learned from our experience.


Enabling Technologies for Simulation Science Conference | 2002

Developing improved metamodels by combining phenomenological reasoning with statistical methods

James H. Bigelow; Paul K. Davis

A metamodel is a relatively small, simple model that approximates the behavior of a large, complex model. A common and superficially attractive way to develop a metamodel is to generate data from a number of large-model runs and to then use off-the-shelf statistical methods without attempting to understand the model's internal workings. This paper describes research illuminating why it is important and fruitful, in some problems, to improve the quality of such metamodels by using various types of phenomenological knowledge. The benefits are sometimes mathematically subtle, but strategically important, as when one is dealing with a system that could fail if any of several critical components fail. Naive metamodels may fail to reflect the individual criticality of such components and may therefore be quite misleading if used for policy analysis. Naive metamodeling may also give very misleading results on the relative importance of inputs, thereby skewing resource-allocation decisions. By inserting an appropriate dose of theory, however, such problems can be greatly mitigated. Our work is intended to be a contribution to the emerging understanding of multiresolution, multiperspective modeling (MRMPM), as well as a contribution to interdisciplinary work combining virtues of statistical methodology with virtues of more theory-based work. Although the analysis we present is based on a particular experiment with a particular large and complex model, we believe that the insights are more general.
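The criticality point can be made concrete with a toy construction (mine, not the paper's experiment): if the "large model" fails on its weakest link, a naive linear regression smooths that structure away, while a metamodel that keeps the phenomenological min() form fits it almost exactly.

```python
import numpy as np

# Toy "large model": output limited by the weaker of two critical
# components, y = min(x1, x2). All data here are synthetic.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=(400, 2))
y = np.minimum(x[:, 0], x[:, 1])

# Naive metamodel: ordinary linear regression on (x1, x2).
X = np.column_stack([np.ones(len(x)), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
naive_pred = X @ beta

# Phenomenologically informed metamodel: keep the min() structure and
# fit only component scale factors a, b by a crude grid search.
grid = np.linspace(0.5, 1.5, 101)
errs = [((np.minimum(a * x[:, 0], b * x[:, 1]) - y) ** 2).mean()
        for a in grid for b in grid]
ia, ib = divmod(int(np.argmin(errs)), len(grid))
informed_pred = np.minimum(grid[ia] * x[:, 0], grid[ib] * x[:, 1])

naive_rmse = float(np.sqrt(((naive_pred - y) ** 2).mean()))
informed_rmse = float(np.sqrt(((informed_pred - y) ** 2).mean()))
```

The naive fit also assigns each input a modest, roughly equal coefficient, obscuring that either component alone can cap the output, which is exactly the misleading importance ranking the abstract warns about.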


SIAM Journal on Applied Mathematics | 1977

A Scaling Theorem and Algorithms

James H. Bigelow; Norman Shapiro

Motivated by the gravity model for trip table generation in transportation theory, we obtain a number of generalizations of a scaling theorem applicable to a problem arising from that model and describe algorithms arising from these theorems or their proofs.
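The scaling problem behind gravity-model trip tables is commonly solved by iterative proportional fitting: given a seed matrix s_ij (e.g., a deterrence function of travel cost), find row factors r_i and column factors c_j so that t_ij = r_i s_ij c_j matches prescribed productions and attractions. The sketch below is that standard algorithm on made-up numbers, not necessarily the algorithm derived in the paper.

```python
import numpy as np

# Iterative proportional fitting (biproportional scaling) for a
# gravity-model trip table. Seed and marginals below are invented;
# row and column totals must agree (here both sum to 30).

def scale(seed, row_sums, col_sums, iters=200):
    t = seed.astype(float).copy()
    for _ in range(iters):
        t *= (row_sums / t.sum(axis=1))[:, None]   # match productions
        t *= (col_sums / t.sum(axis=0))[None, :]   # match attractions
    return t

seed = np.array([[1.0, 2.0],
                 [3.0, 1.0]])
t = scale(seed,
          row_sums=np.array([10.0, 20.0]),
          col_sums=np.array([12.0, 18.0]))
```

With a strictly positive seed and consistent marginals the iteration converges; the theorem-level question of when such scaling factors exist at all (e.g., with zeros in the seed) is the kind of issue the paper's generalizations address.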


Archive | 2017

User's Guide for the Total Force Blue-Line (TFBL) Model

Tara L. Terry; James H. Bigelow; James Pita; Jerry M. Sollinger; Paul Emslie

This user's guide describes the nature of the manned pilot management problem, the analytic capabilities the U.S. Air Force has to cope with it, and how the Total Force Blue-Line model helps the Air Force manage all rated personnel across the Total Force. This guide contains instructions for running the model, including steps to follow, inputs to use, how the inputs can be generated, and a list of steps for generating red-line/blue-line charts.

Collaboration


Dive into James H. Bigelow's collaborations.

Top Co-Authors
Paul K. Davis

Frederick S. Pardee RAND Graduate School
