
Publications


Featured research published by Patrick T. Marsh.


Bulletin of the American Meteorological Society | 2012

An Overview of the 2010 Hazardous Weather Testbed Experimental Forecast Program Spring Experiment

Adam J. Clark; Steven J. Weiss; John S. Kain; Israel L. Jirak; Michael C. Coniglio; Christopher J. Melick; Christopher Siewert; Ryan A. Sobash; Patrick T. Marsh; Andrew R. Dean; Ming Xue; Fanyou Kong; Kevin W. Thomas; Yunheng Wang; Keith Brewster; Jidong Gao; Xuguang Wang; Jun Du; David R. Novak; Faye E. Barthold; Michael J. Bodner; Jason J. Levit; C. Bruce Entwistle; Tara Jensen; James Correia

The NOAA Hazardous Weather Testbed (HWT) conducts annual spring forecasting experiments organized by the Storm Prediction Center and National Severe Storms Laboratory to test and evaluate emerging scientific concepts and technologies for improved analysis and prediction of hazardous mesoscale weather. A primary goal is to accelerate the transfer of promising new scientific concepts and tools from research to operations through the use of intensive real-time experimental forecasting and evaluation activities conducted during the spring and early summer convective storm period. The 2010 NOAA/HWT Spring Forecasting Experiment (SE2010), conducted 17 May through 18 June, had a broad focus, with emphases on heavy rainfall and aviation weather, through collaboration with the Hydrometeorological Prediction Center (HPC) and the Aviation Weather Center (AWC), respectively. In addition, using the computing resources of the National Institute for Computational Sciences at the University of Tennessee, the Center for A...


Weather and Forecasting | 2013

Verification of Convection-Allowing WRF Model Forecasts of the Planetary Boundary Layer Using Sounding Observations

Michael C. Coniglio; James Correia; Patrick T. Marsh; Fanyou Kong

This study evaluates forecasts of thermodynamic variables from five convection-allowing configurations of the Weather Research and Forecasting Model (WRF) with the Advanced Research core (WRF-ARW). The forecasts vary only in their planetary boundary layer (PBL) scheme, including three “local” schemes [Mellor–Yamada–Janjic (MYJ), quasi-normal scale elimination (QNSE), and Mellor–Yamada–Nakanishi–Niino (MYNN)] and two schemes that include “nonlocal” mixing [the asymmetric cloud model version 2 (ACM2) and the Yonsei University (YSU) scheme]. The forecasts are compared to springtime radiosonde observations upstream from deep convection to gain a better understanding of the thermodynamic characteristics of these PBL schemes in this regime. The morning PBLs are all too cool and dry despite having little bias in PBL depth (except for YSU). In the evening, the local schemes produce shallower PBLs that are often too shallow and too moist compared to nonlocal schemes. However, MYNN is nearly unbiased in PBL ...
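The verification in this study boils down to differencing model profiles against radiosonde observations at common vertical levels and summarizing the bias. Below is a minimal sketch of that comparison; the arrays, level choices, and the profile_bias helper are hypothetical stand-ins, not the paper's verification code.

```python
import numpy as np

def profile_bias(fcst, obs):
    """Mean bias (forecast minus observed) across all soundings and levels.

    fcst, obs: arrays of shape (n_soundings, n_levels), already
    interpolated to common vertical levels; NaN marks missing data.
    """
    return np.nanmean(fcst - obs)

# Hypothetical example: 3 soundings, 4 vertical levels of temperature (deg C)
obs = np.array([[25.1, 22.4, 18.0, 14.2],
                [27.3, 24.0, 19.5, 15.1],
                [24.8, 21.9, 17.6, 13.8]])
fcst = obs - 0.7  # a uniformly 0.7 C too-cool forecast, as a stand-in

print(f"temperature bias: {profile_bias(fcst, obs):+.2f} C")  # -> -0.70 C
```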


Bulletin of the American Meteorological Society | 2013

A Feasibility Study for Probabilistic Convection Initiation Forecasts Based on Explicit Numerical Guidance

John S. Kain; Michael C. Coniglio; James Correia; Adam J. Clark; Patrick T. Marsh; Conrad L. Ziegler; Valliappa Lakshmanan; Stuart D. Miller; Scott R. Dembek; Steven J. Weiss; Fanyou Kong; Ming Xue; Ryan A. Sobash; Andrew R. Dean; Israel L. Jirak; Christopher J. Melick

The 2011 Spring Forecasting Experiment in the NOAA Hazardous Weather Testbed (HWT) featured a significant component on convection initiation (CI). As in previous HWT experiments, the CI study was a collaborative effort between forecasters and researchers, with equal emphasis on experimental forecasting strategies and evaluation of prototype model guidance products. The overarching goal of the CI effort was to identify the primary challenges of the CI-forecasting problem and establish a framework for additional studies and possible routine forecasting of CI. This study confirms that convection-allowing models with grid spacing of ~4 km represent many aspects of the formation and development of deep convective clouds explicitly and with predictive utility. Further, it shows that automated algorithms can skillfully identify the CI process during model integration. However, it also reveals that automated detection of individual convection cells, by itself, provides inadequate guidance for ...


Weather and Forecasting | 2013

Tornado Pathlength Forecasts from 2010 to 2011 Using Ensemble Updraft Helicity

Adam J. Clark; Jidong Gao; Patrick T. Marsh; Travis M. Smith; John S. Kain; James Correia; Ming Xue; Fanyou Kong

Examining forecasts from the Storm Scale Ensemble Forecast (SSEF) system run by the Center for Analysis and Prediction of Storms for the 2010 NOAA/Hazardous Weather Testbed Spring Forecasting Experiment, recent research diagnosed a strong relationship between the cumulative pathlengths of simulated rotating storms (measured using a three-dimensional object identification algorithm applied to forecast updraft helicity) and the cumulative pathlengths of tornadoes. This paper updates those results by including data from the 2011 SSEF system, and illustrates forecast examples from three major 2011 tornado outbreaks—16 and 27 April, and 24 May—as well as two forecast failure cases from June 2010. Finally, analysis updraft helicity (UH) from 27 April 2011 is computed using a three-dimensional variational data assimilation system to obtain 1.25-km grid-spacing analyses at 5-min intervals and compared to forecast UH from individual SSEF members.


Weather and Forecasting | 2012

Forecasting Tornado Pathlengths Using a Three-Dimensional Object Identification Algorithm Applied to Convection-Allowing Forecasts

Adam J. Clark; John S. Kain; Patrick T. Marsh; James Correia; Ming Xue; Fanyou Kong

A three-dimensional (in space and time) object identification algorithm is applied to high-resolution forecasts of hourly maximum updraft helicity (UH)—a diagnostic that identifies simulated rotating storms—with the goal of diagnosing the relationship between forecast UH objects and observed tornado pathlengths. UH objects are contiguous swaths of UH exceeding a specified threshold. Including time allows tracks to span multiple hours and entire life cycles of simulated rotating storms. The object algorithm is applied to 3 yr of 36-h forecasts initialized daily from a 4-km grid-spacing version of the Weather Research and Forecasting Model (WRF) run in real time at the National Severe Storms Laboratory (NSSL), and forecasts from the Storm Scale Ensemble Forecast (SSEF) system run by the Center for Analysis and Prediction of Storms for the 2010 NOAA Hazardous Weather Testbed Spring Forecasting Experiment. Methods for visualizing UH object attributes are presented, and the relationship between pathlen...
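As described, a UH object is a connected region exceeding a threshold in the three dimensions x, y, and time, so one simulated storm's multi-hour track becomes a single object. A minimal sketch of that idea using connected-component labeling from scipy.ndimage; the toy array, threshold, and full 26-point connectivity are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy import ndimage

# Toy hourly-maximum UH field with shape (time, y, x); values in m^2 s^-2
uh = np.zeros((3, 5, 5))
uh[0, 1, 1] = 80.0          # storm appears at hour 0...
uh[1, 1, 2] = 95.0          # ...moves east at hour 1...
uh[2, 2, 3] = 60.0          # ...and southeast at hour 2

threshold = 50.0            # illustrative threshold, not the paper's value
mask = uh >= threshold

# Full 3D connectivity: points adjacent in x, y, OR t join the same object,
# so a storm track spanning multiple hours is labeled as one object.
structure = np.ones((3, 3, 3), dtype=int)
labels, n_objects = ndimage.label(mask, structure=structure)
print(n_objects)            # -> 1: the whole track is a single UH object
```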


Bulletin of the American Meteorological Society | 2014

Meteorological Overview of the Devastating 27 April 2011 Tornado Outbreak

Kevin R. Knupp; Todd A. Murphy; Timothy A. Coleman; Ryan Wade; Stephanie Mullins; Christopher J. Schultz; Elise V. Schultz; Lawrence D. Carey; Adam Sherrer; Eugene W. McCaul; Brian Carcione; Stephen Latimer; Andy Kula; Kevin Laws; Patrick T. Marsh; Kim Klockow

By many metrics, the tornado outbreak on 27 April 2011 was the most significant tornado outbreak since 1950, exceeding the super outbreak of 3–4 April 1974. The number of tornadoes over a 24-h period (midnight to midnight) was 199; the tornado fatalities and injuries were 316 and more than 2,700, respectively; and the insurable loss exceeded $4 billion (U.S. dollars). In this paper, we provide a meteorological overview of this outbreak and illustrate that the event was composed of three mesoscale events: a large early morning quasi-linear convective system (QLCS), a midday QLCS, and numerous afternoon supercell storms. The main data sources include NWS and research radars, profilers, surface measurements, and photos and videos of the tornadoes. The primary motivation for this preliminary research is to document the diverse characteristics (e.g., tornado characteristics and mesoscale organization of deep convection) of this outbreak and summarize preliminary analyses that are worthy of additional research ...


Weather and Forecasting | 2015

Diagnosing the Conditional Probability of Tornado Damage Rating Using Environmental and Radar Attributes

Bryan T. Smith; Richard L. Thompson; Andrew R. Dean; Patrick T. Marsh

Radar-identified convective modes, peak low-level rotational velocities, and near-storm environmental data were assigned to a sample of tornadoes reported in the contiguous United States during 2009–13. The tornado segment data were filtered by the maximum enhanced Fujita (EF)-scale tornado event per hour using a 40-km horizontal grid. Convective mode was assigned to each tornado event by examining full volumetric Weather Surveillance Radar-1988 Doppler data at the beginning time of each event, and 0.5° peak rotational velocity (Vrot) data were identified manually during the life span of each tornado event. Environmental information accompanied each grid-hour event, consisting primarily of supercell-related convective parameters from the hourly objective mesoscale analyses calculated and archived at the Storm Prediction Center. Results from examining environmental and radar attributes, featuring the significant tornado parameter (STP) and 0.5° peak Vrot data, suggest an increasing conditional probability for greater EF-scale damage as both STP and 0.5° peak Vrot increase, especially with supercells. Possible applications of these findings include using the conditional probability of tornado intensity as a real-time situational awareness tool.
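The conditional probabilities here are, at bottom, relative frequencies of damage rating within joint thresholds on the environmental and radar attributes. A minimal sketch of that calculation; the events, thresholds, and the cond_prob_sig_tor helper are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical tornado events: (STP, 0.5-degree peak Vrot in kt, EF rating)
events = np.array([
    (0.5, 25.0, 0), (1.2, 35.0, 1), (3.0, 50.0, 2), (5.5, 60.0, 3),
    (6.0, 70.0, 4), (0.8, 30.0, 0), (4.0, 55.0, 3), (2.5, 45.0, 1),
])
stp, vrot, ef = events[:, 0], events[:, 1], events[:, 2]

def cond_prob_sig_tor(stp_min, vrot_min):
    """P(EF >= 2 | STP >= stp_min, Vrot >= vrot_min) as a relative frequency."""
    sel = (stp >= stp_min) & (vrot >= vrot_min)
    if not sel.any():
        return float("nan")
    return float((ef[sel] >= 2).mean())

print(cond_prob_sig_tor(2.0, 40.0))  # -> 0.8 in this toy sample
```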


Bulletin of the American Meteorological Society | 2012

Comments on “Tornado Risk Analysis: Is Dixie Alley an Extension of Tornado Alley?”

Patrick T. Marsh; Harold E. Brooks

Dixon et al. (2011, hereafter DMCA11) present a tornado risk analysis that found parts of the southeast United States are the most tornado prone in the nation, instead of “Oklahoma, the state previously thought to be the maximum for tornado activity (Schaefer et al. 1986; Brooks et al. 2003).” Because both Brooks et al. (2003, hereafter BDK03) and DMCA11 employ kernel density estimation to achieve their depictions of tornado risk, a natural question that arises is why these analyses are so different. DMCA11 attempt to explain these differences as a consequence of their focus on tornado path length instead of tornado frequency. Although there is no question that the focus on slightly different underlying datasets affects the resulting analyses, the differences between the tornado path length and tornado frequency datasets are small enough that this explanation inadequately explains the differences between BDK03 and DMCA11. This comment offers a different explanation as to why the differences between the two studies exist, one that focuses on how the kernel density estimation was conducted. In general, kernel density estimation is a nonparametric method of estimating the underlying probability density function (PDF) of a finite dataset. This is done by choosing a weighting function, or kernel, and a measure of the area subject to the weighting function (this measure is often called the kernel bandwidth or simply bandwidth), and then applying the weighting function to the finite dataset over the prescribed area. A kernel can be any function K(u) that is nonnegative, real valued, integrable, and satisfies ∫K(u) du = 1 (Wilks 2006).
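Since the comment turns on how the kernel density estimation was configured, a small self-contained example may help. The sketch below writes out a one-dimensional Gaussian KDE directly; the sample and the two bandwidths are invented, and the point is only that the same data yield visibly different density estimates as the bandwidth changes.

```python
import numpy as np

def gaussian_kde(x_eval, samples, bandwidth):
    """Kernel density estimate with a Gaussian kernel.

    K(u) = exp(-u^2 / 2) / sqrt(2 pi) is nonnegative and integrates
    to one, so the estimate is itself a valid PDF.
    """
    u = (x_eval[:, None] - samples[None, :]) / bandwidth
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return k.sum(axis=1) / (len(samples) * bandwidth)

# Invented bimodal sample standing in for event locations along one axis
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(0.0, 1.0, 200),
                          rng.normal(6.0, 1.0, 50)])
x = np.linspace(-4.0, 10.0, 200)

# A small bandwidth preserves both modes; a large one smears them together.
narrow = gaussian_kde(x, samples, bandwidth=0.5)
wide = gaussian_kde(x, samples, bandwidth=3.0)
print(narrow.max(), wide.max())
```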


Monthly Weather Review | 2012

A Method for Calibrating Deterministic Forecasts of Rare Events

Patrick T. Marsh; John S. Kain; Valliappa Lakshmanan; Adam J. Clark; Nathan M. Hitchens; Jill Hardy

Convection-allowing models offer forecasters unique insight into convective hazards relative to numerical models using parameterized convection. However, methods to best characterize the uncertainty of guidance derived from convection-allowing models are still unrefined. This paper proposes a method of deriving calibrated probabilistic forecasts of rare events from deterministic forecasts by fitting a parametric kernel density function to the model’s historical spatial error characteristics. This kernel density function is then applied to individual forecast fields to produce probabilistic forecasts.
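The final step described above, turning a deterministic field into probabilities with a fitted kernel, can be sketched as a convolution. The example below assumes an isotropic Gaussian kernel and simply asserts its width; in the paper the kernel parameters are fit to historical spatial error statistics, so treat sigma_gridpoints and the helper name as illustrative.

```python
import numpy as np
from scipy import ndimage

def probabilistic_from_deterministic(event_mask, sigma_gridpoints):
    """Spread a binary deterministic forecast of a rare event into a
    probability field by convolving with a normalized Gaussian kernel."""
    probs = ndimage.gaussian_filter(event_mask.astype(float),
                                    sigma=sigma_gridpoints)
    return np.clip(probs, 0.0, 1.0)

# Toy 20x20 forecast grid with two predicted event points
fcst = np.zeros((20, 20))
fcst[5, 5] = 1.0
fcst[12, 14] = 1.0
probs = probabilistic_from_deterministic(fcst, sigma_gridpoints=2.0)
print(probs.max())  # well below 1: each hit becomes a spread-out probability
```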


Bulletin of the American Meteorological Society | 2017

SHARPpy: An Open-Source Sounding Analysis Toolkit for the Atmospheric Sciences

William G. Blumberg; Kelton T. Halbert; Timothy A. Supinie; Patrick T. Marsh; Richard L. Thompson; John A. Hart

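SHARPpy's basic workflow, per the project's documentation, is to build a Profile object from sounding arrays and then lift parcels from it. The sketch below uses placeholder sounding data; the keyword arguments and the bplus (CAPE) attribute follow SHARPpy's documented examples, but verify them against the installed release.

```python
import numpy as np
import sharppy.sharptab.profile as profile
import sharppy.sharptab.params as params

# Placeholder sounding: pressure (hPa), height (m), temperature and
# dewpoint (deg C), wind speed (kt) and direction (deg) at each level.
pres = np.array([1000., 925., 850., 700., 500., 300.])
hght = np.array([110., 780., 1520., 3100., 5800., 9600.])
tmpc = np.array([24.0, 19.5, 15.0, 5.5, -12.0, -40.0])
dwpc = np.array([20.0, 16.0, 11.0, -2.0, -25.0, -55.0])
wspd = np.array([10., 20., 25., 35., 50., 80.])
wdir = np.array([160., 180., 200., 220., 240., 250.])

# Build a Profile object, then lift a surface-based parcel (flag=1)
prof = profile.create_profile(profile='default', pres=pres, hght=hght,
                              tmpc=tmpc, dwpc=dwpc, wspd=wspd, wdir=wdir)
sfcpcl = params.parcelx(prof, flag=1)
print(sfcpcl.bplus)  # surface-based CAPE (J/kg)
```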

Collaboration


Dive into Patrick T. Marsh's collaborations.

Top Co-Authors

Adam J. Clark, National Oceanic and Atmospheric Administration
Fanyou Kong, University of Oklahoma
John S. Kain, National Oceanic and Atmospheric Administration
Ming Xue, University of Oklahoma
Andrew R. Dean, National Oceanic and Atmospheric Administration
Harold E. Brooks, National Oceanic and Atmospheric Administration
Israel L. Jirak, National Oceanic and Atmospheric Administration
Michael C. Coniglio, National Oceanic and Atmospheric Administration
Bryan T. Smith, National Oceanic and Atmospheric Administration