Mark K. Transtrum
Brigham Young University
Publications
Featured research published by Mark K. Transtrum.
Science | 2013
Benjamin B. Machta; Ricky Chachra; Mark K. Transtrum; James P. Sethna
Information Physics: Multiparameter models, which can emerge in biology and other disciplines, are often sensitive to only a small number of parameters and robust to changes in the rest; approaches from information theory can be used to distinguish between the two parameter groups. In physics, on the other hand, one does not need to know the details at smaller length and time scales in order to understand the behavior on large scales. This hierarchy has been recognized for a long time and formalized within the renormalization group (RG) approach. Machta et al. (p. 604) explored the connection between the two scales by using an information-theoretical approach based on the Fisher Information Matrix to analyze two commonly used physics models (diffusion in one dimension and the Ising model of magnetism) as the time and length scales, respectively, were progressively coarsened. The expected "stiff" parameters emerged, in agreement with RG intuition. An information-theoretical approach is used to distinguish the important parameters in two archetypical physics models.
The microscopically complicated real world exhibits behavior that often yields to simple yet quantitatively accurate descriptions. Predictions are possible despite large uncertainties in microscopic parameters, both in physics and in multiparameter models in other areas of science. We connect the two by analyzing parameter sensitivities in a prototypical continuum theory (diffusion) and at a self-similar critical point (the Ising model). We trace the emergence of an effective theory for long-scale observables to a compression of the parameter space quantified by the eigenvalues of the Fisher Information Matrix. A similar compression appears ubiquitously in models taken from diverse areas of science, suggesting that the parameter space structure underlying effective continuum and universal theories in physics also permits predictive modeling more generally.
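The parameter-space compression described here is easy to reproduce in miniature. The sketch below is illustrative, not from the paper: it computes the Fisher Information Matrix eigenvalues for an assumed sum-of-exponentials model, with made-up decay rates and observation times. Eigenvalues spanning many decades are the hallmark of the compression the abstract describes.

```python
import numpy as np

# Toy multiparameter model: y(t, theta) = sum_i exp(-theta_i * t)
# (an illustrative "sloppy" model; not the diffusion/Ising models of the paper)
def model(theta, t):
    return np.exp(-np.outer(t, theta)).sum(axis=1)

def fisher_information(theta, t, eps=1e-6):
    """FIM for least-squares fitting with unit noise: J^T J,
    where J is the Jacobian of predictions w.r.t. parameters."""
    J = np.empty((len(t), len(theta)))
    for i in range(len(theta)):
        dtheta = np.zeros_like(theta)
        dtheta[i] = eps
        J[:, i] = (model(theta + dtheta, t) - model(theta - dtheta, t)) / (2 * eps)
    return J.T @ J

theta = np.array([0.5, 1.0, 2.0, 4.0])  # assumed decay rates
t = np.linspace(0.1, 5.0, 50)           # assumed observation times
eigvals = np.linalg.eigvalsh(fisher_information(theta, t))
print(eigvals)  # eigenvalues typically span many decades: a few stiff, the rest sloppy
```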
Physical Review Letters | 2010
Mark K. Transtrum; Benjamin B. Machta; James P. Sethna
Fitting model parameters to experimental data is a common yet often challenging task, especially if the model contains many parameters. Typically, algorithms get lost in regions of parameter space in which the model is unresponsive to changes in parameters, and one is left to make adjustments by hand. We explain this difficulty by interpreting the fitting process as a generalized interpolation procedure. By considering the manifold of all model predictions in data space, we find that cross sections have a hierarchy of widths and are typically very narrow. Algorithms become stuck as they move near the boundaries. We observe that the model manifold, in addition to being tightly bounded, has low extrinsic curvature, leading to the use of geodesics in the fitting process. We improve the convergence of the Levenberg-Marquardt algorithm by adding geodesic acceleration to the usual step.
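A minimal sketch of the geodesic-accelerated step described in the abstract, assuming user-supplied residuals and jacobian functions; the acceptance threshold used here is a common heuristic, not necessarily the paper's exact criterion.

```python
import numpy as np

def lm_step_with_geodesic_acceleration(residuals, jacobian, theta, lam, h=1e-4):
    """One damped Gauss-Newton step plus a second-order 'geodesic
    acceleration' correction. residuals(theta) -> r, jacobian(theta) -> J;
    lam is the Levenberg-Marquardt damping parameter."""
    r = residuals(theta)
    J = jacobian(theta)
    A = J.T @ J + lam * np.eye(len(theta))
    # First-order (velocity) step: the standard Levenberg-Marquardt direction
    v = np.linalg.solve(A, -J.T @ r)
    # Directional second derivative of the residuals along v (finite difference)
    rvv = (residuals(theta + h * v) - 2 * r + residuals(theta - h * v)) / h**2
    # Second-order (acceleration) correction
    a = np.linalg.solve(A, -J.T @ rvv)
    # Accept the acceleration only while it stays small relative to the velocity
    if np.linalg.norm(a) < 0.75 * np.linalg.norm(v):
        return theta + v + 0.5 * a
    return theta + v
```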
Radiotherapy and Oncology | 2013
Tommy Sheu; Jessica M. Molkentine; Mark K. Transtrum; Thomas A. Buchholz; Hubert Rodney Withers; Howard D. Thames; Kathy A. Mason
PURPOSE To test the appropriateness of the linear-quadratic (LQ) model for describing survival of jejunal crypt clonogens after split doses with a variable (small 1-6 Gy, large 8-13 Gy) first dose, as a test of its appropriateness for both small and large fraction sizes. METHODS C3Hf/KamLaw mice were exposed to whole-body irradiation using 300 kVp X-rays at a dose rate of 1.84 Gy/min, and the number of viable jejunal crypts was determined using the microcolony assay. A total dose of 14 Gy was split into unequal first and second fractions separated by 4 h. Data were analyzed using the LQ model, the lethal-potentially-lethal (LPL) model, and a repair-saturation (RS) model. RESULTS Cell kill was greater in the group receiving the larger fraction first, creating an asymmetry in the plot of survival vs. size of first dose, in contrast to the symmetric response predicted by the LQ model. There was a significant difference in the estimated βs (higher β after larger first doses), but no significant difference in the αs, when large doses were given first vs. small doses first. This difference results in underestimation (by approximately 8% based on the present data) of isoeffect doses calculated from LQ model parameters derived from small fraction sizes. While the LPL model also predicted a symmetric response inconsistent with the data, the RS model results were consistent with the observed asymmetry. CONCLUSION The LQ model underestimates doses for isoeffective crypt-cell survival at large fraction sizes (in the present setting, >9 Gy).
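Why the LQ model predicts a symmetric response can be seen directly: with complete repair between fractions, survival depends on the split only through d1² + d2², which is symmetric about an even split. A short illustration with assumed α and β values:

```python
import numpy as np

# Under the LQ model with complete repair between two fractions,
# the surviving fraction for doses d1 and d2 (d1 + d2 = D) is
#   S = exp(-(alpha*d1 + beta*d1**2)) * exp(-(alpha*d2 + beta*d2**2)).
# S depends on the split only through d1**2 + (D - d1)**2, which is
# symmetric about d1 = D/2 -- the symmetric prediction tested in the paper.
alpha, beta, D = 0.2, 0.02, 14.0  # illustrative values (per Gy, per Gy^2, Gy)

def lq_split_survival(d1):
    d2 = D - d1
    return np.exp(-(alpha * (d1 + d2) + beta * (d1**2 + d2**2)))

for d1 in (1.0, 5.0, 9.0, 13.0):
    print(d1, lq_split_survival(d1))  # d1 = 1 and d1 = 13 give identical survival
```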
Methods | 2015
Mark K. Transtrum; Lee D. Hansen; Colette F. Quinn
The purposes of this paper are (a) to examine the effect of the calorimeter time constant (τ) on heat rate data from a single enzyme injection into substrate in an isothermal titration calorimeter (ITC), (b) to provide information that can be used to predict the optimum experimental conditions for determining the rate constant (k2), Michaelis constant (KM), and enthalpy change of the reaction (ΔRH), and (c) to describe methods for evaluating these parameters. We find that KM, k2, and ΔRH can be accurately estimated without correcting for the calorimeter time constant, τ, if (k2E/KM), where E is the total active enzyme concentration, is between 0.1/τ and 1/τ and the reaction goes to at least 99% completion. If experimental conditions are outside this domain and no correction is made for τ, errors in the inferred parameters quickly become unreasonable. A method for fitting single-injection data to the Michaelis-Menten or Briggs-Haldane model to simultaneously evaluate KM, k2, ΔRH, and τ is described and validated with experimental data. All four of these parameters can be accurately inferred provided the reaction rate constant (k2E/KM) is larger than 1/τ and the data include enzyme-saturated conditions.
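A minimal simulation, under assumed parameter values, of how the calorimeter time constant τ distorts the heat rate from a single enzyme injection: Michaelis-Menten kinetics passed through a first-order instrument lag. This illustrates the effect the paper corrects for; it is not the authors' fitting code.

```python
# Illustrative single-injection simulation. The measured heat rate lags the
# true heat rate through a first-order instrument response with time
# constant tau:  dq_meas/dt = (q_true - q_meas) / tau.
k2, KM, E = 100.0, 1e-4, 1e-8    # assumed: turnover (1/s), Michaelis constant (M), enzyme (M)
S0, dH, tau = 1e-3, -40e3, 15.0  # assumed: substrate (M), enthalpy (J/mol), time constant (s)
# Note: k2*E/KM = 0.01 1/s lies between 0.1/tau and 1/tau, the regime the paper identifies.

dt = 0.1                              # time step (s)
S, q_meas, trace = S0, 0.0, []
for step in range(6000):              # 600 s of data
    rate = k2 * E * S / (KM + S)      # Michaelis-Menten rate (M/s)
    q_true = dH * rate                # true heat rate per unit volume (W/L)
    q_meas += dt * (q_true - q_meas) / tau  # calorimeter lag (Euler step)
    S = max(S - dt * rate, 0.0)       # substrate depletion
    trace.append(q_meas)
print(min(trace))                     # peak (most exothermic) measured heat rate
```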
BMC Bioinformatics | 2012
Mark K. Transtrum; Peng Qiu
Background: Parameter estimation in biological models is a common yet challenging problem. In this work we explore the problem for gene regulatory networks modeled by differential equations with unknown parameters, such as decay rates, reaction rates, Michaelis-Menten constants, and Hill coefficients. We explore the extent to which parameters can be efficiently estimated by appropriate experimental selection. Results: A minimization formulation is used to find the parameter values that best fit the experimental data. When the data are insufficient, the minimization problem often has many local minima that fit the data reasonably well. We show that selecting a new experiment based on the local Fisher Information of one local minimum generates additional data that allow one to successfully discriminate among the many local minima. The parameters can be estimated to high accuracy by iteratively performing minimization and experiment selection. We show that the experiment choices are roughly independent of which local minimum is used to calculate the local Fisher Information. Conclusions: We show that by an appropriate choice of experiments, one can, in principle, efficiently and accurately estimate all the parameters of a gene regulatory network. In addition, we demonstrate that appropriate experiment selection can also allow one to restrict model predictions without constraining the parameters, using many fewer experiments. We suggest that predicting model behaviors and inferring parameters represent two different approaches to model calibration, with different requirements on data and experimental cost.
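A sketch of one way to implement the experiment-selection step, assuming a hypothetical list of candidate_designs and a user-supplied jacobian_at(theta, design) sensitivity function; the E-optimal rule shown (maximize the smallest Fisher Information eigenvalue) is a stand-in for the paper's selection criterion, not a reproduction of it.

```python
import numpy as np

def design_next_experiment(candidate_designs, jacobian_at, theta_hat):
    """Pick the candidate experiment whose measurements most increase
    the smallest eigenvalue of the local Fisher Information evaluated
    at the current best-fit parameters theta_hat (a simple E-optimal
    rule; jacobian_at and candidate_designs are hypothetical inputs)."""
    best, best_score = None, -np.inf
    for design in candidate_designs:
        J = jacobian_at(theta_hat, design)  # prediction sensitivities for this design
        fim = J.T @ J                       # local Fisher Information (unit noise)
        score = np.linalg.eigvalsh(fim)[0]  # smallest eigenvalue
        if score > best_score:
            best, best_score = design, score
    return best
```

Iterating this selection with refitting, as the abstract describes, progressively eliminates local minima that disagree with the newly collected data.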
Biochimica et Biophysica Acta | 2016
Lee D. Hansen; Mark K. Transtrum; Colette F. Quinn; Neil A. Demarse
BACKGROUND Isothermal calorimetry allows monitoring of reaction rates via direct measurement of the rate of heat produced by the reaction. Calorimetry is one of very few techniques that can be used to measure rates without taking a derivative of the primary data. Because heat is a universal indicator of chemical reactions, calorimetry can be used to measure kinetics in opaque solutions, suspensions, and multiphase systems and does not require chemical labeling. The only significant limitation of calorimetry for kinetic measurements is that the time constant of the reaction must be greater than the time constant of the calorimeter, which can range from a few seconds to a few minutes. Calorimetry has the unique ability to provide both kinetic and thermodynamic data. SCOPE OF REVIEW This article describes the calorimetric methodology for determining reaction kinetics and reviews examples from the recent literature that demonstrate applications of titration calorimetry to determine the kinetics of enzyme-catalyzed and ligand-binding reactions. MAJOR CONCLUSIONS A complete model for the temperature dependence of enzyme activity is presented. A method previously in common use for blank corrections in determinations of equilibrium constants and enthalpy changes for binding reactions is shown to be subject to significant systematic error. GENERAL SIGNIFICANCE Methods for determining the kinetics of enzyme-catalyzed reactions and for simultaneously determining the thermodynamics and kinetics of ligand-binding reactions are reviewed.
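The relation underlying these methods is that the measured heat rate is directly proportional to the reaction rate. A minimal conversion with assumed values:

```python
# Core calorimetric relation: the heat rate is proportional to the
# reaction rate,  dQ/dt = dH * V * r,  so  r = (dQ/dt) / (dH * V).
heat_rate = -2.0e-6  # measured heat rate, W (J/s); assumed value
dH = -40e3           # molar enthalpy of reaction, J/mol; assumed value
V = 1.0e-3           # reaction cell volume, L; assumed value

rate = heat_rate / (dH * V)  # reaction rate in mol/(L*s)
print(f"reaction rate = {rate:.2e} M/s")
```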
Physical Review B | 2011
Mark K. Transtrum; Gianluigi Catelani; James P. Sethna
We study the superheating field of a bulk superconductor within Ginzburg-Landau theory, which is valid near the critical temperature. We calculate, as functions of the Ginzburg-Landau parameter κ, the superheating field H_sh.
PLOS Computational Biology | 2016
Andrew White; Malachi Tolman; Howard D. Thames; Hubert Rodney Withers; Kathy A. Mason; Mark K. Transtrum
PLOS Computational Biology | 2016
Mark K. Transtrum; Peng Qiu
Archive | 2016
Brian K. Mannakee; Aaron P. Ragsdale; Mark K. Transtrum; Ryan N. Gutenkunst