William V. Harper
Otterbein University
Publications
Featured research published by William V. Harper.
The American Statistician | 2011
William V. Harper; Ted G. Eschenbach; Thomas R. James
Maximum likelihood estimation of the two-parameter Weibull distribution is straightforward; however, there are multiple methods for maximum likelihood estimation of the three-parameter Weibull. The third parameter of the three-parameter Weibull distribution shifts the origin from 0 to some generally positive value sometimes called the location, threshold, or minimum life. This article initially evaluates twelve statistical packages on four real-world datasets, including oil spill data from the Gulf of Mexico. The different methods used by the packages result in substantial differences in the estimated parameters across packages. Some statistical packages do not offer three-parameter Weibull estimation, and other software attempts the estimation only in certain cases. This has major implications for those needing to estimate or apply a three-parameter Weibull distribution, which is used frequently in practice. A subset of the twelve packages is subsequently analyzed in detail based on an experimental design using pseudo-random Weibull datasets. This article also discusses the most common estimation method employed, which is maximizing a profile log-likelihood function.
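As a concrete illustration of the profile log-likelihood approach mentioned above, the sketch below fixes a candidate threshold, fits the remaining two parameters by standard MLE, and keeps the threshold with the highest profile log-likelihood. It assumes SciPy's weibull_min; the grid search and the cap just below the sample minimum are illustrative choices, not the procedure of any package evaluated in the article.

```python
# Minimal sketch of profile log-likelihood estimation for a three-parameter
# Weibull (threshold gamma, shape c, scale b). Illustrative only; not the
# exact algorithm used by any package evaluated in the article.
import numpy as np
from scipy.stats import weibull_min

def profile_three_param_weibull(x, n_grid=200):
    x = np.asarray(x, dtype=float)
    best = (-np.inf, None)
    # Candidate thresholds must stay strictly below the sample minimum.
    for gamma in np.linspace(0.0, x.min() * 0.999, n_grid):
        shifted = x - gamma
        # Two-parameter (shape, scale) MLE with the location fixed at 0.
        c, _, scale = weibull_min.fit(shifted, floc=0)
        ll = weibull_min.logpdf(shifted, c, loc=0, scale=scale).sum()
        if ll > best[0]:
            best = (ll, (gamma, c, scale))
    return best[1]  # (threshold, shape, scale) at the profile maximum

# Example with simulated data (true threshold 5, shape 1.5, scale 10):
rng = np.random.default_rng(1)
sample = 5 + weibull_min.rvs(1.5, scale=10, size=500, random_state=rng)
print(profile_three_param_weibull(sample))
```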
2014 10th International Pipeline Conference | 2014
Tom Bubenik; William V. Harper; Pam Moreno; Steven J. Polasik
Pipeline operators around the world use in-line inspections and corrosion control systems to manage the integrity of their systems. Determining when to inspect is a critical consideration, which depends in part on whether corrosion growth takes place between inspections. Remaining life estimates based on estimated corrosion growth rates typically form the basis for reassessment intervals.

Remaining life assessments often use assumptions about corrosion rates that, while conservative, can lead to unrealistic results. Excess conservatism leads to short reassessment intervals and unnecessary mitigation. This paper discusses how data analyses can be used to identify and verify areas where corrosion is actually taking place. By identifying and addressing these areas, operators can minimize unnecessary mitigation in low growth areas, ensure high growth areas are mitigated in a timely manner, and extend overall reassessment intervals.

This paper discusses an integrated approach to identifying corrosion activity using a combination of statistics, inspection signal comparisons, and engineering analyses. The approach relies on a full understanding of the mechanisms that cause corrosion and its growth. Pipeline operators can use this approach to calculate remaining life, prioritize repairs and mitigation, and extend reassessment intervals. This process is collectively known as Statistically Active Corrosion (SAC) [1,2,3].
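For orientation only, the sketch below shows the remaining-life arithmetic that typically sits behind a reassessment interval: a matched-anomaly growth rate from two inspections and the time to reach a critical depth. It is not the SAC process itself, which additionally relies on signal comparisons and engineering analysis; the function names and all numbers are hypothetical.

```python
# Illustrative remaining-life arithmetic only; the SAC process described above
# additionally uses signal comparisons and engineering review to decide whether
# an anomaly is actively growing. All numbers below are hypothetical.
def corrosion_growth_rate(depth_old_pct, depth_new_pct, years_between):
    """Matched-anomaly growth rate in % wall thickness per year."""
    return max(depth_new_pct - depth_old_pct, 0.0) / years_between

def remaining_life_years(depth_new_pct, rate_pct_per_yr, critical_depth_pct=80.0):
    """Years until the anomaly reaches the critical depth at a constant rate."""
    if rate_pct_per_yr <= 0:
        return float("inf")  # no measured growth, so no finite estimate
    return (critical_depth_pct - depth_new_pct) / rate_pct_per_yr

rate = corrosion_growth_rate(depth_old_pct=22, depth_new_pct=31, years_between=6)
print(rate, remaining_life_years(31, rate))  # 1.5 %wt/yr, about 32.7 years
```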
2012 9th International Pipeline Conference | 2012
Clifford J. Maier; Pamela J. Moreno; William V. Harper; David J. Stucki; Steven J. Polasik; Thomas A. Bubenik; David A. R. Shanks; Neil A. Bates
When it comes to managing the integrity of corroded pipelines, operators are confronted with many difficult decisions, one of which is the level of conservatism that is used in pipeline integrity assessments. The financial implications associated with excavation, repair, rehabilitation, and inspection programs typically balance the level of conservatism that is adopted. More conservative approaches translate into more spending, so it is important that repair strategies developed from the integrity assessment results are effective.

As integrity assessment methodologies continue to evolve, so does the ability to account for local conditions. One development in recent years has been the ability to evaluate multiple MFL in-line inspections to determine areas of active corrosion growth through the combined use of statistics, inspection signal comparisons, and engineering analysis. The authors have previously outlined one approach, commonly known as Statistically Active Corrosion (SAC), that has been successfully used to identify areas of probable corrosion growth, predict local corrosion growth rates, and maximize the effectiveness of integrity assessments [1].

Validation of the SAC-predicted corrosion growth rates is important for establishing confidence in the process. This is achieved through inspection signal comparisons, integrating close interval survey (CIS) results, and (when possible) field verification. The means by which these methods are used for validating the SAC method are described in this paper.
2012 9th International Pipeline Conference | 2012
William V. Harper; David J. Stucki; Thomas A. Bubenik; Clifford J. Maier; David A. R. Shanks; Neil A. Bates
The importance of comparing in-line inspection (ILI) calls to excavation data should not be underestimated. Neither should it be undertaken without a solid understanding of the methodologies being employed. Such a comparison is not only a key part of assessing how well the tool performed, but is also needed for an API 1163 evaluation and any subsequent use of the ILI data. The development of unity (1-1) plots and the associated regression analysis are commonly used to provide the basis for predicting the likelihood of leaks or failures from unexcavated ILI calls. Combining such analysis with statistically active corrosion methods into perhaps a probability of exceedance (POE) study helps develop an integrity maintenance plan for the years ahead. The theoretical underpinnings of standard regression analysis are based on the assumption that the independent variable (often thought of as x) is measured without error as a design variable. The dependent variable (often labeled y) is modeled as having uncertainty or error. Pipeline companies may run their regressions differently, but ILI to field excavation regressions often use the ILI depth as the x variable and the field depth as the y variable. This is especially the case when a probability of exceedance analysis is desired, which involves transforming ILI calls to predicted depths for comparison to a threshold of interest such as 80% wall thickness. However, in ILI to field depth regressions, both measured depths can have error. Thus, the underlying least squares regression assumptions are violated. One common result is a regression line that has a slope much less than the ideal 1-1 relationship.
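The slope attenuation described above can be demonstrated with simulated data. The sketch below contrasts ordinary least squares with a Deming (errors-in-variables) fit, one standard remedy when both depths carry measurement error; the paper's own treatment may differ, and the error-variance ratio delta and all data are assumptions.

```python
# Sketch contrasting ordinary least squares with an errors-in-variables
# (Deming) fit for ILI-vs-field depth data. Data below are simulated,
# not field measurements.
import numpy as np

def ols_slope(x, y):
    x, y = np.asarray(x), np.asarray(y)
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def deming_slope(x, y, delta=1.0):
    """Deming regression slope; delta = var(error in y) / var(error in x)."""
    x, y = np.asarray(x), np.asarray(y)
    sxx, syy = np.var(x), np.var(y)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return (syy - delta * sxx
            + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)

rng = np.random.default_rng(0)
true_depth = rng.uniform(10, 60, 300)          # %wt, hypothetical population
ili = true_depth + rng.normal(0, 8, 300)       # ILI measurement error
field = true_depth + rng.normal(0, 8, 300)     # field measurement error
print(ols_slope(ili, field))     # attenuated below the ideal 1-1 slope
print(deming_slope(ili, field))  # close to 1 when both errors are modeled
```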
Volume 2: Integrity Management; Poster Session; Student Paper Competition | 2006
Patrick H. Vieth; Clifford J. Maier; William V. Harper; Elden R. Johnson; Bhaskar Neogi; U. J. Baskurt; Alan Beckett
In-line inspection (ILI) of the Trans Alaska Pipeline System (TAPS) using high resolution metal loss tools indicated 77 locations with suspected minor mechanical damage features (MDF). The tools used are able to detect the presence of a suspected feature and measure indented dimensions, but are insufficient to detect the presence of cracks or gouges needed to reliably assess feature severity based solely on the ILI data. Excavations of the 42 sites deemed most severe provided important field data characterizing residual deformation dimensions and the occurrence of gouges or cracks, and allowed a reliable field assessment of defect severity. Upon completion of the excavations, 35 possible MDF locations remained unexcavated. An engineering evaluation was undertaken to assess whether or not these remaining minor MDF pose a threat significant enough to warrant excavation. Multiple assessment methods were utilized, including deterministic, probabilistic, and risk assessment methods. The probabilistic assessment of the 35 unexcavated MDFs was performed using PCFStat (Pressure Cycle Fatigue Statistical Assessment), which uses Monte Carlo simulation to estimate remaining fatigue life. PCFStat performs thousands of simulations for each case, with the input parameters randomly selected from expected distributions. Of particular importance is the fatigue environment of the location. The results of the probabilistic assessment were used to estimate the potential for failure of the remaining MDFs. The results suggest that 25 of the 35 unexcavated damage features had a POF of less than 10^-4 over the remaining expected pipeline life cycle and thus are unlikely to fail. Alyeska considered a combination of probabilistic, deterministic, and risk assessment results to decide on the actual locations to be examined. The results of the probabilistic analysis also were found to support the outcome of the operator's risk-based evaluation process.
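PCFStat itself is not publicly documented here, so the sketch below only illustrates the general Monte Carlo structure the abstract describes: sample uncertain inputs from assumed distributions, compute a fatigue life for each draw, and estimate the probability of failure as the fraction of draws that fail within the remaining service period. The Basquin-type life model and every distribution parameter are hypothetical placeholders, not PCFStat inputs.

```python
# Generic Monte Carlo probability-of-failure sketch in the spirit of the
# simulation approach described above. This is NOT PCFStat: the S-N style
# life model and all distribution parameters are hypothetical placeholders
# chosen only to show the sampling-and-counting structure.
import numpy as np

rng = np.random.default_rng(42)
n_sims = 100_000
years_remaining = 30
cycles_per_year = rng.normal(500, 50, n_sims)            # pressure cycles/year
stress_range = rng.lognormal(np.log(60), 0.15, n_sims)   # MPa, hypothetical
log_C = rng.normal(10.2, 0.4, n_sims)                     # Basquin-type constant
m = 3.0                                                    # Basquin-type exponent

cycles_to_failure = 10 ** log_C / stress_range ** m
cycles_applied = cycles_per_year * years_remaining
pof = np.mean(cycles_to_failure < cycles_applied)
print(f"Estimated POF over {years_remaining} years: {pof:.2e}")
```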
2014 10th International Pipeline Conference | 2014
Steven J. Polasik; Sean Keane; William V. Harper; Tom Bubenik
The ability of deterministic fracture mechanics assessments to correctly estimate the predicted burst pressure (PBP) is dependent upon the accuracy of the model as well as the assumptions made regarding mechanical properties and defect geometry. Within the assessment of defects identified through field investigation or through ILI programs, the use of extreme bounding values for all input variables can lead to unacceptably over-conservative predictions and impede the ability to achieve a target safety margin.

This paper examines the effect multiple assumptions have on the bias between predicted and actual burst pressures for various kinds of defects that have caused in-service and hydrostatic pressure test failures in ERW and flash weld line pipe materials. The predicted failure pressures of defects documented within a recent U.S. Department of Transportation compendium of ERW and flash weld seam failures were analyzed using a variety of scenarios based on knowledge of key nominal parameters such as grade, outside diameter, wall thickness, and the maximum defect length and depth. The ratio of the predicted to actual failure pressure was statistically examined across the scenarios for the different defect types.

Observations are made regarding the use of the PBP model in order to statistically quantify the accuracy of the model, which can be used as input for an operator to develop a process that achieves a target safety margin.
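To make the input-assumption sensitivity concrete, the sketch below uses a Modified B31G style metal-loss burst calculation as a simple stand-in model and compares the predicted-to-actual pressure ratio under nominal versus lower-bound wall thickness assumptions. The paper's defects are crack-like ERW and flash weld seam features assessed with fracture mechanics models, which this sketch does not reproduce; all pipe and defect values are hypothetical.

```python
# Illustrative only: a Modified B31G style metal-loss burst-pressure
# calculation used as a stand-in to show how bounding vs. nominal input
# assumptions shift the predicted/actual pressure ratio. Not the
# fracture-mechanics model used in the paper.
import math

def modified_b31g_burst(D, t, d, L, smys):
    """Burst pressure (MPa) for a metal-loss defect, Modified B31G (0.85 dL)."""
    flow = smys + 68.95                      # flow stress = SMYS + 10 ksi
    z = L**2 / (D * t)
    M = math.sqrt(1 + 0.6275*z - 0.003375*z**2) if z <= 50 else 0.032*z + 3.3
    s_fail = flow * (1 - 0.85*d/t) / (1 - 0.85*d/(t*M))
    return 2 * s_fail * t / D

# Hypothetical defect: nominal vs. lower-bound wall thickness assumption.
actual_burst = 11.0                           # MPa, hypothetical test result
nominal = modified_b31g_burst(D=610, t=7.9, d=3.2, L=120, smys=359)
bounding = modified_b31g_burst(D=610, t=7.1, d=3.2, L=120, smys=359)
print(nominal / actual_burst, bounding / actual_burst)
```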
2014 10th International Pipeline Conference | 2014
Joseph P. Bratton; Mitch Glass; Edgar Ivan Cote; Andy Gallagher; William V. Harper
Current federal regulations in the U.S. require excavation of plain dents identified through in-line inspection surveys based primarily on depth. Industry experience, and previous research, has shown that the depth of the dent alone is not sufficient to assess dent severity and that releases could occur at dents below the excavation threshold (Dawson, 2006). Canada's National Energy Board released a safety advisory on June 18, 2010, to all companies under their jurisdiction regarding two incidents involving shallow dents. The safety advisory stated that all integrity management programs should be reviewed and updated where appropriate to address the threat posed by shallow dents. Similar incidents have raised awareness in the United States and elsewhere around the world.

This paper focuses on an extensive multi-year effort to analyze the fitness for service of unconstrained shallow dents on multiple pipeline systems. Fatigue and strain analyses were performed to determine the serviceability and estimate the remaining service life. The dents in this study included both topside dents and bottomside dents that were previously evaluated through excavation to be unconstrained. Results of the fatigue and strain assessments are presented, along with field results of dents that were chosen for excavation.

Comparison of the fitness-for-service results and subsequent excavation findings was performed to improve an ongoing campaign to prioritize several hundred in-service unconstrained dents. Maximum strain levels of the dents were calculated based on the geometry of the dent as determined by radial sensor measurements from in-line inspection surveys. The results of the in-line inspection and field measurement comparisons were analyzed to determine the accuracy and possible adjustments of strain assessments for the ongoing fitness-for-service program.
Engineering Management Journal | 2013
Ted Eschenbach; William V. Harper
Occurrence rates for rare events must often be estimated with limited data of unknown and changing applicability; thus, many risk models rely on average measures of risk even though more accurate possibilities may exist. A generalized approach is presented for selecting the best model of the rare event risk and for matching how the results are presented to decision-making needs. The process is illustrated using a case study of estimating oil spill rates from undersea pipelines that serve production platforms on the Outer Continental Shelf areas of the Gulf of Mexico. Existing models use average oil spill rates from these undersea pipelines, but many of these spills occur at a platform; thus, a new approach using a fixed/variable rate framework is suggested. We review the literature on modeling of Outer Continental Shelf oil spill rates, analyze historical oil spills for the estimated rates for both models, and contrast the results (which can differ by up to a factor of 2) for Arctic applications. Other quantitative applications for deeper, more distant wells in the Gulf of Mexico, air and spacecraft flights, maintenance of nuclear facilities, and failure of project activities are suggested. The proposed model is based on estimating average rates for the fixed and variable components. This model is placed on a continuum of risk models based on their data requirements, which provides a new conceptual framework for risk model “levels.” Implications for qualitative mental models of risk are also suggested; thus, there are many potential applications in risk models used by managers and operators.
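A small worked contrast may help fix ideas: with purely illustrative rates (not the historical Gulf of Mexico estimates from the paper), the expected spill count from a per-mile-year average rate can differ noticeably from the fixed (per platform) plus variable (per mile-year) formulation, especially for short lines serving few platforms.

```python
# Worked contrast between an average-rate spill model and a fixed/variable
# rate framework like the one described above. All rates and exposure
# figures are hypothetical, chosen only to show the structure.
def expected_spills_average(rate_per_mile_year, miles, years):
    return rate_per_mile_year * miles * years

def expected_spills_fixed_variable(fixed_per_platform_year, platforms,
                                   variable_per_mile_year, miles, years):
    return (fixed_per_platform_year * platforms
            + variable_per_mile_year * miles) * years

# A short line serving few platforms: per-mile averaging understates the
# platform-driven (fixed) component relative to the two-part model.
avg = expected_spills_average(rate_per_mile_year=2e-4, miles=50, years=20)
split = expected_spills_fixed_variable(fixed_per_platform_year=5e-3, platforms=2,
                                       variable_per_mile_year=1e-4, miles=50, years=20)
print(avg, split)   # 0.2 vs 0.3 expected spills over the period
```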
2010 8th International Pipeline Conference, Volume 3 | 2010
William V. Harper; David J. Stucki; Taylor M. Shie; Ray J. Davies
Pipeline facilities are ageing and will likely soon come under closer scrutiny from federal regulation. It is imperative that sound reliability based inspection procedures be established that meet the goals of an organization while controlling time and cost. DNV Columbus has developed a statistically based sequential inspection decision support system for this purpose. This system was implemented for an international petroleum company and accelerated the inspection process by making a “stop inspections” or “continue inspections” decision after each inspection at a facility. Inspections can be stopped because the desired reliability metrics have been met, meaning the point of diminishing returns has been reached based on inspections that did not reveal a significant amount of corrosion; at that point, further sampling would provide minimal additional value to the reliability assessment. Inspections can also be stopped because the estimated reliability metrics cannot be met; stopping for this reason indicates the facility may need more significant repair or replacement, and engineers and managers can then make a decision that includes a variety of factors including safety and the economic feasibility of alternatives. Otherwise, inspections continue because insufficient data have been collected to determine whether the reliability metrics have been met. The system is illustrated with actual data. The paper also describes the use of four key safety factors in developing site specific reliability goals: consequence, off site migration probability, product type, and facility size. This work can result in major savings in time and financial expenditures for an inspection cycle. The reliability based inspection methodology leads to the following improvements: 1) quicker decisions that save time and money and allow more sites to be inspected in a timelier manner; 2) the reliability of a group of inspections is quantified after each inspection; 3) results at a facility are broken down by database driven categories into a scorecard; and 4) the methodology is kept generic so it can be easily adapted to a wide variety of situations.
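The stop/continue logic can be sketched with a simple Bayesian rule: after each inspection, update a posterior on the fraction of locations with significant corrosion and stop once the reliability goal is clearly met or clearly unreachable. This is only a minimal illustration of the sequential idea; the DNV Columbus system's actual reliability metrics, safety-factor weighting, and scorecard are not reproduced, and the thresholds below are hypothetical.

```python
# Minimal sequential stop/continue sketch using a Beta-Binomial posterior
# on the fraction of locations with significant corrosion. Thresholds and
# the decision rule are hypothetical, not the deployed system's metrics.
from scipy.stats import beta

def decide(n_inspected, n_corroded, defect_limit=0.05, confidence=0.90):
    """Return 'stop-pass', 'stop-fail', or 'continue' after each inspection."""
    posterior = beta(1 + n_corroded, 1 + n_inspected - n_corroded)
    p_ok = posterior.cdf(defect_limit)      # P(corroded fraction < limit)
    if p_ok >= confidence:
        return "stop-pass"                  # reliability goal met
    if 1 - p_ok >= confidence:
        return "stop-fail"                  # goal unreachable; consider repair
    return "continue"                       # not enough data yet

# Example: 30 inspections so far, 1 location with significant corrosion.
print(decide(n_inspected=30, n_corroded=1))
```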
Transportation Research Record | 1991
William V. Harper; Kamran Majidzadeh