
Publication


Featured research published by Sean McKenna.


IEEE Internet Computing | 2013

Scalable Anomaly Detection for Smart City Infrastructure Networks

Djellel Eddine Difallah; Philippe Cudré-Mauroux; Sean McKenna

Dynamically detecting anomalies can be difficult in very large-scale infrastructure networks. The authors' approach addresses spatiotemporal anomaly detection in a smart-city context with large numbers of deployed sensors. They propose a scalable, hybrid Internet infrastructure for dynamically detecting potential anomalies in real time using stream processing. The infrastructure enables analytically inspecting and comparing anomalies globally using large-scale array processing. Deployed on a real pipe network topology of 1,891 nodes, this approach can effectively detect and characterize anomalies while minimizing the amount of data shared across the network.
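The paper's hybrid stream-processing infrastructure is far richer than any short example, but the per-sensor streaming step can be sketched with a simple sliding-window z-score detector; the class, window size, and threshold below are illustrative assumptions, not the authors' method.

```python
from collections import deque
import math

class SensorAnomalyDetector:
    """Flag readings that deviate strongly from a sensor's recent history."""

    def __init__(self, window=50, threshold=3.0):
        self.window = deque(maxlen=window)  # sliding window of past readings
        self.threshold = threshold          # z-score trigger level

    def update(self, value):
        """Return True if `value` is anomalous relative to the window."""
        anomalous = False
        if len(self.window) >= 10:  # need some history before judging
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomalous = True
        self.window.append(value)
        return anomalous

det = SensorAnomalyDetector()
readings = [10.0 + 0.1 * (i % 5) for i in range(40)] + [25.0]  # spike at the end
flags = [det.update(r) for r in readings]
```

In a deployment along the lines the abstract describes, only the flags (rather than raw readings) would need to cross the network, which is one way to minimize shared data.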


Water Resources Research | 2013

On a recent solute transport laboratory experiment involving sandstone and its modeling

Timothy R. Ginn; Mohamed K. Nassar; Tamir Kamai; Katherine A. Klise; Vince Tidwell; Sean McKenna

We analyze and simulate laboratory data on flow and solute transport in a submeter-scale sample of Massillon sandstone, and we re-evaluate studies that have stated that these data indicate a failure of the advection-dispersion equation and that nonlocal modeling approaches are necessary. Our examination reveals experimental issues, including artificial edge effects in the data as well as inconsistency in the measured solute injection rates. When the edge effects are removed, the data no longer exhibit power-law tailing. Our simulations show that failure of the advection-dispersion equation has not been demonstrated and that nonlocal approaches are not necessary.
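The advection-dispersion equation at the center of this re-evaluation has a classical closed-form solution for one-dimensional continuous injection (Ogata-Banks). The sketch below uses illustrative parameter values, not those of the Massillon sandstone experiment.

```python
import math

def ade_concentration(x, t, v=1.0, D=0.1, c0=1.0):
    """Ogata-Banks solution of the 1-D advection-dispersion equation
    for continuous injection at x = 0 with inlet concentration c0.
    v: mean velocity, D: dispersion coefficient (illustrative units)."""
    if t <= 0:
        return 0.0
    a = (x - v * t) / (2.0 * math.sqrt(D * t))
    b = (x + v * t) / (2.0 * math.sqrt(D * t))
    # The exponential can overflow for large v*x/D; it multiplies a
    # vanishing erfc term, so treat overflow as zero contribution.
    try:
        second = math.exp(v * x / D) * math.erfc(b)
    except OverflowError:
        second = 0.0
    return 0.5 * c0 * (math.erfc(a) + second)

# Concentration approaches c0 well behind the advective front (x << v*t)
c_behind = ade_concentration(x=0.1, t=10.0)
# ...and remains near zero well ahead of it (x >> v*t)
c_ahead = ade_concentration(x=50.0, t=10.0)
```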


Seismological Research Letters | 2016

Using Wavelet Covariance Models for Simultaneous Picking of Overlapping P‐ and S‐Wave Arrival Times in Noisy Single‐Component Data

Alex Rinehart; Sean McKenna; Thomas A. Dewers

We present a method for automatically identifying overlapping elastic-wave phase arrivals in single-component data. The algorithm applies to traditional near-source seismic, microseismicity and picoseismicity monitoring, and acoustic emission monitoring; we use acoustic emission examples as a worst-case demonstration. These signals have low signal-to-noise ratios and, because of small geometric dimensions, overlapping P- and S-wave arrivals. Our method uses the statistics of temporal covariance across many wavelet scales. We use a nonnormalized rectilinearity function of the scale covariance. The workflow begins by denoising signals and making a rough first-arrival estimate. We then perform a continuous Daubechies wavelet transform over tens to hundreds of scales on the signal and find a moving covariance across transform scales. The nonnormalized rectilinearity is calculated for each of the covariance matrices, and we sharpen changes in the rectilinearity values with a maximization filter. We then estimate phase arrival times using thresholds of the filtered rectilinearity. Overall, we have a high success rate for both P- and S-wave arrivals. Remaining challenges include estimation of arrival times of long-duration, cigar-shaped events, and culling complex high-magnitude electrical noise. By using higher-order Daubechies wavelet transforms, the scale covariance metric reflects variations in higher-moment statistics (skewness and kurtosis) and changes in short-term versus long-term means, as well as the covariance across timescales of the signal. For single-component data, it is necessary to preserve both amplitude and correlation information of the signal; this necessitates using the nonnormalized rectilinearity function.
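The full scale-covariance workflow is involved, but its opening step, a rough first-arrival estimate, can be illustrated with a classical STA/LTA trigger. Note this is a standard textbook technique, not the paper's wavelet rectilinearity measure; the synthetic trace, window lengths, and trigger level are all illustrative.

```python
import numpy as np

def sta_lta(signal, n_sta=20, n_lta=200):
    """Causal short-term / long-term average energy ratio; peaks near
    energy onsets, giving a rough first-arrival estimate."""
    energy = np.concatenate(([0.0], np.cumsum(signal ** 2)))
    n = len(signal)
    ratio = np.zeros(n)
    for i in range(n_lta, n):
        sta = (energy[i + 1] - energy[i + 1 - n_sta]) / n_sta
        lta = (energy[i + 1] - energy[i + 1 - n_lta]) / n_lta
        ratio[i] = sta / max(lta, 1e-12)  # guard quiet stretches
    return ratio

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 2000)                            # background noise
trace[1200:1400] += np.sin(np.linspace(0, 40 * np.pi, 200))   # emergent phase
ratio = sta_lta(trace)
pick = int(np.argmax(ratio > 4.0))  # first sample where the trigger fires
```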


Archive | 2016

Scalable Anomaly Detection for Smart City Infrastructures

Djellel Eddine Difallah; Philippe Cudré-Mauroux; Sean McKenna; Daniel Fasel

This chapter describes an information system that can detect anomalies in large networks, such as a city's water supply network. Using a prototype, it shows how potential anomalies can be detected dynamically and in real time.


IBM Journal of Research and Development | 2016

Temperature dynamics and water quality in distribution systems

Bradley J. Eck; Hirotaka Saito; Sean McKenna

Quality assurance strategies for water distribution systems often include the application of chemical disinfectants to limit the growth and transmission of pathogens. Characteristics of water quality in individual systems, and the type of disinfectant employed, create significant complexity in understanding and quantifying the impact of disinfectants in different networks. An additional challenge is that disinfection byproducts (DBPs), created through the breakdown of disinfectants, can be detrimental to human health. Therefore, it is necessary to maintain the correct level of disinfectant to control microbial pathogens while also limiting formation of DBPs to acceptable levels. Limiting formation of DBPs is an ongoing challenge for operators. We examined the impact of ground surface temperature and changes in network operations on disinfectant breakdown. As drinking-water utilities encourage customers to conserve water, water residence times within networks increase. In addition, as surface temperatures increase, heat transfer into the drinking water in subsurface pipes can also increase. Here, we review the literature on ways in which temperature affects disinfection rates and the production of DBPs and select one pathway for more detailed assessment. Numerical modeling is used to examine the changes in DBP production as a function of residence time, ground surface temperature, and heat transfer through the soil to the pipe.
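The temperature pathway the abstract describes can be sketched with a generic first-order disinfectant decay model carrying an Arrhenius temperature correction. This is a common textbook formulation, not the paper's numerical model, and every constant below (rate, activation energy, temperatures) is an illustrative assumption.

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def chlorine_decay(c0, hours, temp_c, k_ref=0.05, t_ref_c=20.0, ea=50e3):
    """First-order chlorine decay with an Arrhenius temperature correction.
    k_ref [1/h] is the decay rate at the reference temperature t_ref_c;
    ea [J/mol] is an illustrative activation energy."""
    t_k, t_ref_k = temp_c + 273.15, t_ref_c + 273.15
    k = k_ref * math.exp(-ea / R * (1.0 / t_k - 1.0 / t_ref_k))
    return c0 * math.exp(-k * hours)

# Warmer water lowers residual chlorine for the same residence time;
# DBP formation is often modelled as proportional to chlorine consumed.
c_cool = chlorine_decay(c0=1.0, hours=24, temp_c=15.0)
c_warm = chlorine_decay(c0=1.0, hours=24, temp_c=25.0)
dbp_proxy_warm = 1.0 - c_warm  # consumed chlorine as a crude DBP proxy
```

This makes the abstract's coupling concrete: longer residence times (larger `hours`) and warmer subsurface pipes (larger `temp_c`) both drive the residual down and the DBP proxy up.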


OCEANS Conference | 2014

Assessment and quantification of HF radar uncertainty

Fearghal O'Donncha; Sean McKenna; Teresa Updyke; Hugh Roarty; Emanuele Ragnoli

A large body of work exists concerning uncertainty in ocean-current-measuring high-frequency radar (HFR) systems. This study investigates the magnitude of uncertainty present in an HFR system in the lower Chesapeake Bay region of Virginia. One method of assessing the fundamental performance of an HFR network is to compare the radial velocities measured by two facing HF radars at the centre point of their baseline. In an error-free network, radial vectors from the two sites would be equal and opposite at a point on the baseline, so the magnitude of their sum represents a measure of imperfection in the data. Often, essential information lies not in any individual process variable but in how the variables change with respect to one another, i.e., how they co-vary. Principal component analysis (PCA) is a data-driven modelling technique that transforms a set of correlated variables into a smaller set of uncorrelated variables while retaining most of the original information. This paper adopts PCA to detect anomalies in data coming from the individual HF stations. A PCA model is developed from a calibration set of historical data and applied to new process data, in combination with multivariate statistical techniques, to detect changes in the system. Based on a comprehensive analysis, the study presents an objective preconditioning methodology for preprocessing HFR data prior to assimilation into coastal ocean models or other uses sensitive to the divergence of the flow.
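The calibrate-then-monitor pattern described above can be sketched as PCA with a squared-prediction-error (Q statistic) control limit; the synthetic "radial velocity" channels, component count, and 99th-percentile limit are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def fit_pca(X, n_components):
    """Return (mean, retained components) fitted to a calibration set."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:n_components]

def spe(X, mu, components):
    """Squared prediction error (Q statistic): residual energy after
    projecting onto the retained principal components."""
    Xc = X - mu
    recon = Xc @ components.T @ components
    return ((Xc - recon) ** 2).sum(axis=1)

rng = np.random.default_rng(1)
# Calibration set: 5 correlated "radial velocity" channels driven by 2 modes
modes = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 5))
calib = modes @ mixing + 0.05 * rng.normal(size=(500, 5))

mu, comps = fit_pca(calib, n_components=2)
limit = np.percentile(spe(calib, mu, comps), 99)  # control limit from history

# New process data: one sample breaks the learned correlation structure
new = modes[:3] @ mixing + 0.05 * rng.normal(size=(3, 5))
new[1] += np.array([3.0, -3.0, 3.0, -3.0, 3.0])  # inconsistent across channels
flags = spe(new, mu, comps) > limit
```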


International Conference on Pattern Recognition | 2014

Bad Data Analysis with Sparse Sensors for Leak Localisation in Water Distribution Networks

Francesco Fusco; Bradley J. Eck; Sean McKenna

Traditional bad data detection and localisation, based on state estimation and residual analysis, produces misleading results, with high rates of false positives and negatives, when strongly correlated residuals arise from low sensor redundancy. By clustering the measurements according to the structure of the residual covariance matrix, the proposed method extends bad data analysis to the localisation and estimation of anomalies at the coarser resolution of clusters rather than single measurements. The method is applied to the problem of water leak localisation and demonstrated on a realistic test case: the water distribution network of a major European city.
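The clustering idea can be sketched as grouping sensors via connected components of a thresholded residual correlation matrix. This is a simplified stand-in for the paper's covariance-structure analysis; the synthetic residuals and the 0.8 threshold are assumptions for illustration.

```python
import numpy as np

def cluster_by_correlation(residuals, threshold=0.8):
    """Group sensors into clusters via connected components of the
    thresholded absolute residual correlation matrix."""
    corr = np.abs(np.corrcoef(residuals.T))
    n = corr.shape[0]
    labels = [-1] * n
    cluster = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack = [i]          # flood-fill from an unlabelled sensor
        labels[i] = cluster
        while stack:
            j = stack.pop()
            for k in range(n):
                if labels[k] == -1 and corr[j, k] > threshold:
                    labels[k] = cluster
                    stack.append(k)
        cluster += 1
    return labels

rng = np.random.default_rng(2)
base_a = rng.normal(size=400)
base_b = rng.normal(size=400)
# Sensors 0-1 share one residual mode, sensors 2-3 another
residuals = np.column_stack([
    base_a + 0.1 * rng.normal(size=400),
    base_a + 0.1 * rng.normal(size=400),
    base_b + 0.1 * rng.normal(size=400),
    base_b + 0.1 * rng.normal(size=400),
])
labels = cluster_by_correlation(residuals)
```

An anomaly would then be localised to a cluster (e.g. sensors 0-1) rather than to a single, possibly unreliable, measurement.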


Allerton Conference on Communication, Control, and Computing | 2013

Time-variant regularization in affine projection algorithms

Amadou Ba; Sean McKenna

We propose a time-variant regularization in affine projection algorithms, where we update the regularization parameter with a gradient method using a momentum term parametrized by a momentum rate. To further improve the convergence properties of the algorithm in transient stages while ensuring a small final misadjustment, we adaptively estimate the momentum parameter. Then, we prove both the weak and strong convergence of the adaptive regularization. We apply the newly proposed algorithm to water quality data for prediction purposes, where we show that the developed algorithm outperforms existing time-varying regularization approaches.
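As a sketch of the algorithm family on a synthetic system-identification task: the following runs a standard affine projection filter in which the regularization simply decays geometrically over time, whereas the paper adapts it with a momentum-based gradient rule. All parameter values are illustrative.

```python
import numpy as np

def apa_identify(x, d, order=4, proj=3, mu=0.5, delta0=1.0, decay=0.995):
    """Affine projection adaptive filter with a time-varying regularization.
    Simplification: delta decays geometrically from delta0; the paper
    instead adapts it with a momentum-parametrized gradient method."""
    w = np.zeros(order)
    delta = delta0
    errors = []
    for n in range(order + proj - 1, len(x)):
        # Data matrix of the last `proj` input regressors (one per row)
        X = np.array([x[n - k - order + 1: n - k + 1][::-1] for k in range(proj)])
        dvec = np.array([d[n - k] for k in range(proj)])
        e = dvec - X @ w
        # Regularized affine projection update
        w = w + mu * X.T @ np.linalg.solve(X @ X.T + delta * np.eye(proj), e)
        delta *= decay  # heavier regularization early, lighter near convergence
        errors.append(float(e[0]) ** 2)
    return w, errors

rng = np.random.default_rng(3)
true_w = np.array([0.8, -0.4, 0.2, 0.1])       # unknown system to identify
x = rng.normal(size=3000)                       # input signal
d = np.convolve(x, true_w)[: len(x)] + 0.01 * rng.normal(size=3000)
w_hat, errors = apa_identify(x, d)
```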


Stochastic Environmental Research and Risk Assessment | 2018

Surrogate modeling and risk-based analysis for solute transport simulations

Ernesto Arandia; Fearghal O’Donncha; Sean McKenna; Seshu Tirupathi; Emanuele Ragnoli

This study is driven by the question of how quickly a solute will be flushed from an aquatic system after input of the solute into the system ceases. Simulating the fate and transport of a solute in an aquatic system can be performed at high spatial and temporal resolution using a computationally demanding state-of-the-art hydrodynamics simulator. However, uncertainties in the system often require stochastic treatment, and risk-based analysis requires a large number of simulations, rendering the use of a physical model impractical. A surrogate model that represents a second-level physical abstraction of the system is developed and coupled with a Monte Carlo based method to generate volumetric inflow scenarios. The surrogate model provides a speed-up of approximately eight orders of magnitude over the full physical model, enabling uncertainty quantification through Monte Carlo simulation. The approach developed here consists of a stochastic inflow generator, a solute concentration prediction mechanism based on the surrogate model, and a system response risk assessment method. The probabilistic outcome relates the uncertain quantities to the relevant response in terms of the system's ability to remove the solute. We develop a general approach that can be applied across a wide range of system configurations and solute types. As a test case, we present a study specific to salinization of a lake.
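The generator-surrogate-risk pipeline can be sketched with a toy well-mixed-basin (CSTR) surrogate in place of the paper's physically derived model; the volume, inflow distribution, concentrations, and one-year risk criterion are all illustrative assumptions.

```python
import math
import random

def flush_time(volume, inflow, c0=100.0, c_target=1.0):
    """Time for a well-mixed basin to dilute a solute from c0 to c_target,
    assuming solute-free inflow at rate `inflow` (toy CSTR surrogate:
    c(t) = c0 * exp(-inflow * t / volume))."""
    return (volume / inflow) * math.log(c0 / c_target)

random.seed(4)
volume = 1e6  # m^3 (illustrative)

# 1) Stochastic inflow generator: lognormal daily-average inflow, m^3/day
samples = [random.lognormvariate(math.log(2e4), 0.5) for _ in range(10000)]

# 2) Surrogate prediction for each inflow scenario
times = [flush_time(volume, q) for q in samples]

# 3) Risk-style summary: probability the system needs over a year to recover
p_slow = sum(t > 365 for t in times) / len(times)
```

Because each surrogate evaluation is essentially free, the Monte Carlo loop over thousands of inflow scenarios is trivial, which is exactly the economy the abstract's eight-orders-of-magnitude speed-up buys.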


Archive | 2018

Statistical Parametric Mapping for Geoscience Applications

Sean McKenna

Spatial fields are a common representation of continuous geoscience and environmental variables. Examples include permeability, porosity, mineral content, contaminant levels, seismic impedance, elevation, and reflectance/absorption in satellite imagery. Identifying differences between spatial fields is often of interest, as those differences may represent key indicators of change. Defining a significant difference is often problem specific, but generally includes some measure of both the magnitude and the spatial extent of the difference. This chapter demonstrates a set of techniques available for the detection of anomalies in difference maps represented as multivariate spatial fields. The multiGaussian model is used as a model of spatially distributed error, and several techniques based on the Euler characteristic are employed to define the significance of the number and size of excursion sets in the truncated multiGaussian field. This review draws heavily on developments made in the field of functional magnetic resonance imaging (fMRI) and applies them to several examples motivated by environmental and geoscience problems.
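The excursion-set idea can be illustrated by thresholding a difference field and counting connected components, which in 2-D equals the Euler characteristic when the components have no holes. This sketch uses uncorrelated Gaussian noise rather than a full multiGaussian spatial model, and the field size, patch, and 3.5 threshold are illustrative.

```python
import numpy as np

def excursion_components(field, threshold):
    """Count 4-connected components of the excursion set {field > threshold}
    by iterative flood fill."""
    mask = field > threshold
    seen = np.zeros_like(mask, dtype=bool)
    rows, cols = mask.shape
    count = 0
    for i in range(rows):
        for j in range(cols):
            if mask[i, j] and not seen[i, j]:
                count += 1
                stack = [(i, j)]
                seen[i, j] = True
                while stack:
                    r, c = stack.pop()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols \
                                and mask[rr, cc] and not seen[rr, cc]:
                            seen[rr, cc] = True
                            stack.append((rr, cc))
    return count

rng = np.random.default_rng(5)
noise = rng.normal(size=(64, 64))          # "no change" difference map
anomaly = noise.copy()
anomaly[20:26, 30:36] += 5.0               # a coherent difference patch
# A high threshold suppresses noise excursions but keeps the real anomaly
n_noise = excursion_components(noise, 3.5)
n_anom = excursion_components(anomaly, 3.5)
```

Comparing the observed component count against what the error model predicts for pure noise is, in spirit, how significance is assigned to excursion sets.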

