
Publications


Featured research published by Sitakanta Mohanty.


Reliability Engineering & System Safety | 2006

Variable screening and ranking using sampling-based sensitivity measures

Y.-T. (Justin) Wu; Sitakanta Mohanty

This paper presents a methodology for screening out insignificant random variables and ranking the significant random variables using sensitivity measures, including two cumulative distribution function (CDF)-based and two mean-response-based measures. The methodology features (1) using random samples to compute sensitivities and (2) using acceptance limits, derived from hypothesis testing, to classify random variables as significant or insignificant. Because no approximation is needed in either the form of the performance functions or the type of continuous distribution functions representing input variables, the sampling-based approach can handle highly nonlinear functions with non-normal variables. The main characteristics and effectiveness of the sampling-based sensitivity measures are investigated using both simple and complex examples. Because the number of samples needed does not depend on the number of variables, the methodology appears to be particularly suitable for problems with large, complex models that have many random variables but relatively few significant ones.
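The sampling-based screening idea can be sketched on a hypothetical response function. The response, the 90th-percentile conditioning threshold, and the Kolmogorov-Smirnov distance used here are illustrative choices, not the paper's exact measures:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical nonlinear response: only x0 and x2 actually matter.
def performance(x):
    return 5.0 * x[:, 0] + 10.0 * np.sin(x[:, 2])

n, d = 5000, 5
x = rng.normal(size=(n, d))   # one random sample set, reused for every input
y = performance(x)

# CDF-based measure: distance between each input's CDF conditioned on the
# top 10% of responses and its unconditional CDF (a two-sample
# Kolmogorov-Smirnov distance).  Inputs whose distance falls inside an
# acceptance limit from the two-sample test would be screened out.
top = y >= np.quantile(y, 0.90)

def ks_distance(a, b):
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / len(a)
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

scores = [ks_distance(x[top, i], x[:, i]) for i in range(d)]
ranking = np.argsort(scores)[::-1]   # most influential first
```

Note that the same n samples produce a score for all d inputs, which is why the cost of this kind of screening does not grow with the number of variables.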


Reliability Engineering & System Safety | 2001

Sensitivity analysis of a complex, proposed geologic waste disposal system using the Fourier Amplitude Sensitivity Test method

Yichi Lu; Sitakanta Mohanty

The Fourier Amplitude Sensitivity Test (FAST) method has been used to perform a sensitivity analysis of a computer model developed for conducting total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, Nevada, USA. The computer model has a large number of random input parameters with assigned probability density functions, which may or may not be uniform, for representing data uncertainty. The FAST method, which was previously applied to models with parameters represented by the uniform probability distribution function only, has been modified to be applied to models with nonuniform probability distribution functions. Using an example problem with a small input parameter set, several aspects of the FAST method, such as the effects of integer frequency sets and random phase shifts in the functional transformations, and the number of discrete sampling points (equivalent to the number of model executions) on the ranking of the input parameters have been investigated. Because the number of input parameters of the computer model under investigation is too large to be handled by the FAST method, less important input parameters were first screened out using the Morris method. The FAST method was then used to rank the remaining parameters. The validity of the parameter ranking by the FAST method was verified using the conditional complementary cumulative distribution function (CCDF) of the output. The CCDF results revealed that the introduction of random phase shifts into the functional transformations, proposed by previous investigators to disrupt the repetitiveness of search curves, does not necessarily improve the sensitivity analysis results because it destroys the orthogonality of the trigonometric functions, which is required for Fourier analysis.
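A minimal sketch of the classical FAST construction the paper builds on (uniform inputs, an integer frequency set, no random phase shift) might look like the following. The test model, the frequency set, and the harmonic count are illustrative assumptions; the paper's extension to nonuniform distributions is not shown:

```python
import numpy as np

def fast_first_order(model, freqs, n_samples, harmonics=4):
    """Classical FAST first-order indices for inputs uniform on [0, 1]."""
    s = np.pi * (2.0 * np.arange(1, n_samples + 1) - n_samples - 1) / n_samples
    # Search-curve transformation x_i(s) = 1/2 + arcsin(sin(w_i s)) / pi
    x = 0.5 + np.arcsin(np.sin(np.outer(s, np.asarray(freqs)))) / np.pi
    y = model(x)

    def spectral_power(w):
        a = (2.0 / n_samples) * np.sum(y * np.cos(w * s))
        b = (2.0 / n_samples) * np.sum(y * np.sin(w * s))
        return a * a + b * b

    # Total variance from the whole counted spectrum; partial variance for
    # input i from the harmonics of its assigned frequency w_i.
    total = sum(spectral_power(w) for w in range(1, harmonics * max(freqs) + 1)) / 2.0
    partial = [sum(spectral_power(p * w) for p in range(1, harmonics + 1)) / 2.0
               for w in freqs]
    return np.array(partial) / total

# Hypothetical additive test model; x2 (frequency 27) is inert.
freqs = [11, 21, 27]
model = lambda x: 4.0 * x[:, 0] + np.sin(2.0 * np.pi * x[:, 1])
S = fast_first_order(model, freqs, n_samples=2001)
```

The indices recovered for the inert input stay near zero only as long as the integer frequencies and their counted harmonics do not interfere, which is the orthogonality property the abstract notes is destroyed by random phase shifts.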


Reliability Engineering & System Safety | 2001

CDF sensitivity analysis technique for ranking influential parameters in the performance assessment of the proposed high-level waste repository at Yucca Mountain, Nevada, USA

Sitakanta Mohanty; Y.-T. (Justin) Wu

A cumulative distribution function (CDF)-based method has been used to perform sensitivity analysis on a computer model that conducts total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, and to identify the most influential input parameters affecting the output of the model. The performance assessment computer model, referred to as the TPA code, was recently developed by the US Nuclear Regulatory Commission (NRC) and the Center for Nuclear Waste Regulatory Analyses (CNWRA) to evaluate the performance assessments conducted by the US Department of Energy (DOE) in support of their license application. The model uses a probabilistic framework implemented through Monte Carlo or Latin hypercube sampling (LHS) to permit the propagation of uncertainties associated with model parameters, conceptual models, and future system states. The problem involves more than 246 uncertain parameters (also referred to as random variables), of which the ones that have significant influence on the response or the uncertainty of the response must be identified and ranked. The CDF-based approach identifies and ranks important parameters based on the sensitivity of the response CDF to the input parameter distributions. Based on a reliability sensitivity concept [AIAA Journal 32 (1994) 1717], the response CDF is defined as the integral of the joint probability density function of the input parameters, with a domain of integration defined by a subset of the samples. The sensitivity analysis does not require explicit knowledge of any specific relationship between the response and the input parameters, and the sensitivity is dependent upon the magnitude of the response. The method allows for calculating sensitivity over a wide range of the response and is not limited to the mean value.
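The underlying reliability-sensitivity idea, namely differentiating the response CDF with respect to an input distribution parameter using the same Monte Carlo sample set, can be illustrated for independent, unit-variance normal inputs. The linear response, its weights, and the 95th-percentile probe point are invented for the example and are not the TPA code's models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear response with independent, unit-variance normal inputs.
w = np.array([3.0, 0.1, 1.0])      # invented influence weights
mu = np.zeros(3)                   # means of the input distributions
n = 200_000
x = rng.normal(loc=mu, scale=1.0, size=(n, 3))
y = x @ w

t = np.quantile(y, 0.95)           # probe the upper tail of the response CDF

# Score-function estimate of the CDF sensitivity dF_Y(t)/dmu_i:
#   d/dmu_i E[1{Y <= t}] = E[1{Y <= t} * (X_i - mu_i)]   (unit-variance normals)
# so one sample set yields a sensitivity for every input parameter at once.
indicator = (y <= t).astype(float)
sens = (indicator[:, None] * (x - mu)).mean(axis=0)
ranking = np.argsort(-np.abs(sens))  # most influential first
```

Moving the threshold t along the response axis gives the sensitivity over a wide range of the response, not just at the mean, which is the property the abstract emphasizes.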


Journal of Physics D | 1997

Effect of multiphase fluid saturation on the thermal conductivity of geologic media

Sitakanta Mohanty

This paper presents a method for calculating the effective stagnant thermal conductivity of consolidated porous media with multiphase fluid saturation. The method takes into account the pore-level heterogeneity in the rock and uses a realistic distribution of multiphase fluids in the pores. Unlike the relationships developed in the past, the proposed fluid-morphology-based method shows that the effective thermal conductivity bears a bilinear relation to fluid saturation, a trend observed in soil experiments. Explicit incorporation of pore-scale fluid morphology also shows a higher sensitivity of the effective thermal conductivity to intermediate multiphase saturations compared to an equivalent-medium representation of multiphase fluids in the pore space.
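The bilinear trend can be written as a simple piecewise-linear curve. The breakpoint saturation and the conductivity endpoints below are assumed, illustrative values, not results from the paper's pore-scale model:

```python
import numpy as np

# Illustrative bilinear conductivity-saturation curve: two linear segments
# meeting at a breakpoint saturation S*, the shape reported in soil
# experiments.  All numbers are assumed for the example.
k_dry, k_star, k_wet = 0.5, 1.6, 2.0   # conductivities, W/(m K)
s_star = 0.25                          # breakpoint saturation

def k_eff(s):
    s = np.asarray(s, dtype=float)
    low = k_dry + (k_star - k_dry) * s / s_star
    high = k_star + (k_wet - k_star) * (s - s_star) / (1.0 - s_star)
    return np.where(s <= s_star, low, high)
```

The steep segment below the breakpoint captures the stronger dependence of the effective conductivity on saturation changes in the partially saturated regime that the abstract describes.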


Nuclear Engineering and Design | 2000

An approach to the assessment of high-level radioactive waste containment — II: radionuclide releases from an engineered barrier system

Sitakanta Mohanty; Richard B. Codell; Tae M. Ahn; Gustavo A. Cragnolino

This paper is the second in a series describing the models used in the Engineered Barrier System Performance Assessment Code (EBSPAC) to represent processes that govern the failure of waste packages (WPs) and the release of radionuclides from the engineered barrier system (EBS). These models are specifically adapted to the US Department of Energy (DOE) WP design, adopted in 1996, for the proposed high-level radioactive waste (HLW) repository at Yucca Mountain (YM). The design consists of a double-wall overpack composed of two concentric containers of different metallic materials in a horizontal drift emplacement. EBSPAC was developed to deterministically evaluate the performance of the engineered barriers and to be used as the source term module in the Center for Nuclear Waste Regulatory Analyses (CNWRA)/Nuclear Regulatory Commission (NRC) Total-system Performance Assessment (TPA) code. EBSPAC has two distinct parts. The part dealing with radionuclide release subsequent to WP failure is the focus of this paper, in which various models (i.e. dry-air oxidation and aqueous dissolution of spent fuel (SF), gaseous and aqueous release of radionuclides) are presented, whereas modeling of the WP failure is described in a companion paper. An example problem is presented to illustrate computational results obtained with the code, analyzing the influence on the source term of several critical input parameters related to the repository and EBS designs and the resulting environmental conditions. The source term calculations are confined to the radionuclides being released just outside of the WP. Both gaseous and aqueous release calculations are performed using models that include radionuclide decay, in-growth of daughter products in the chains, the degradation process of SF, temporal variation of inventory in the WP, and spatial variations in the properties of the surrounding material. The degree of complexity varies from model to model as necessary simplifications are made, while ensuring conservatism.
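The decay-and-in-growth bookkeeping that such source-term models must perform can be illustrated with the closed-form Bateman solution for a hypothetical two-member chain. The decay constants and initial inventory are invented for the example; EBSPAC itself handles full chains with time-varying inventory and release:

```python
import numpy as np

# Hypothetical two-member chain: parent -> daughter -> stable.
lam_p, lam_d = 0.30, 0.05      # decay constants, 1/yr (illustrative)
N_p0, N_d0 = 1.0e6, 0.0        # initial inventories, atoms (illustrative)

def bateman(t):
    """Closed-form Bateman solution: parent decay plus daughter in-growth."""
    t = np.asarray(t, dtype=float)
    N_p = N_p0 * np.exp(-lam_p * t)
    N_d = (N_d0 * np.exp(-lam_d * t)
           + lam_p * N_p0 / (lam_d - lam_p)
           * (np.exp(-lam_p * t) - np.exp(-lam_d * t)))
    return N_p, N_d
```

The daughter inventory first grows as the parent decays and then falls at its own decay rate, which is why a release model cannot simply scale the initial inventory by a single exponential.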


Nuclear Engineering and Design | 2000

An approach to the assessment of high-level radioactive waste containment. I: Waste package degradation

Gustavo A. Cragnolino; Sitakanta Mohanty; Darrell S. Dunn; Narasi Sridhar; Tae M. Ahn

A description is presented of the various models (i.e. thermal, chemical environment, humid-air and aqueous corrosion) used in the engineered barrier system performance assessment code (EBSPAC) to represent processes that govern the failure of waste packages (WPs) and, ultimately, the release of radionuclides from the engineered barrier system (EBS). These models are specifically adapted to the US Department of Energy (DOE) WP design, adopted in 1996, for the proposed repository at Yucca Mountain (YM), NV. The design consists of a double-wall overpack composed of two concentric containers of different metallic materials in a horizontal drift emplacement. EBSPAC was developed to deterministically evaluate the performance of the engineered barriers and to be used as the source term module incorporated in the Center for Nuclear Waste Regulatory Analyses (CNWRA)/Nuclear Regulatory Commission (NRC) total performance assessment (TPA) code. EBSPAC essentially consists of two separate codes. The code dealing with WP failure calculations is the focus of this paper and the other, dealing with radionuclide release, is described in a companion paper. An example problem is presented to illustrate the results obtained with the code analyzing the influence of several critical input parameters, related to the repository and EBS designs and the resulting environmental conditions, on WP failure.


Journal of Contaminant Hydrology | 1998

A test of long-term, predictive, geochemical transport modeling at the Akrotiri archaeological site

William M. Murphy; English C. Pearcy; Ronald T. Green; James D. Prikryl; Sitakanta Mohanty; Bret W. Leslie; Ashok Nedungadi

A study of elemental transport at the Akrotiri archaeological site on the island of Santorini, Greece, has been conducted to evaluate the use of natural analog data in support of long-term predictive modeling of the performance of a proposed geologic repository for nuclear waste at Yucca Mountain, Nevada. Akrotiri and Yucca Mountain have many analogous features, including silicic volcanic rocks, relatively dry climates, and oxidizing, hydrologically unsaturated subsurface conditions. Transport of trace elements from artifacts buried in volcanic ash 3600 years ago at Akrotiri is analogous to transport of radioactive wastes in the proposed repository. Subtle evidence for a plume of Cu, Zn, and Pb has been detected by selective leaching of packed earth and bedrock samples collected immediately beneath the site where bronze and lead artifacts were excavated. The geologic setting of the artifacts and the hydraulic properties of the enclosing media were characterized. A numerical model of the type used in repository performance assessments was developed for elemental transport at the site. Site characterization data were used to build the model, but no prior information on the nature of the contaminant plume was provided to the modelers. Some model results are qualitatively consistent with field data, including the small amount of material transported, limited amounts of sorbed material, and relatively elevated sorption on a packed earth layer. However, discrepancies result from incomplete representation of heterogeneity and complexity and from poorly constrained model parameters. Identification of such system characteristics and model limitations in relevant systems is a major contribution that analog studies can make in support of repository modeling.


Journal of Petroleum Science and Engineering | 1995

Sodium orthosilicate: An effective additive for alkaline steamflood

Sitakanta Mohanty; Santanu Khataniar

Steamflood performance often suffers from channelling and gravity segregation, resulting in poor sweep efficiency. Alkaline additives may be used with steam for certain types of crude oils to improve steamflood performance. In this paper, an experimental study is presented to evaluate the effectiveness of sodium hydroxide, sodium metasilicate and sodium orthosilicate for improving steamflood performance in the Wilmington Tar Zone crude oil. The experimental apparatus was designed to completely eliminate heat loss from the core so that the effects of the alkaline additives on saturated steam, rather than on the condensed water, could be studied. The results show that sodium orthosilicate outperforms sodium hydroxide and sodium metasilicate in enhancing oil recovery by a steamflood.


Nuclear Technology | 2004

Independent Postclosure Performance Estimates of the Proposed Repository at Yucca Mountain

Sitakanta Mohanty; Richard B. Codell

The key findings from a suite of independent analyses of the performance of the proposed repository at Yucca Mountain, conducted by the Center for Nuclear Waste Regulatory Analyses (CNWRA) and the U.S. Nuclear Regulatory Commission (NRC), are summarized. The analyses are geared toward obtaining risk insights from deterministic and probabilistic calculations of potential exposure to people in a down-gradient community, determining the capability of barriers to reduce flow of water and prevent or delay radionuclide transport, and identifying models, parameters, and subsystems that have the most influence on repository performance through the use of sensitivity and uncertainty analyses. The analyses have allowed the CNWRA and NRC to focus on the most critical aspects of estimating postclosure repository performance.


Archive | 2004

The Interpretation of Risk and Sensitivity Under the Peak-of-the-Mean Concept

Richard B. Codell; David W. Esh; Sitakanta Mohanty

In quantitative performance assessment (PA) for nuclear waste repositories, probabilistic (e.g., Monte Carlo) calculations are frequently used to estimate dose and risk [1]. Each Monte Carlo realization represents an uncertain estimate of the future effect of the repository. There are at least two ways to interpret the model output: (1) take the peak doses from the Monte Carlo realizations and draw conclusions from their ensemble, e.g., the mean of the peak doses; or (2) at each instant of time, look at the ensemble of all realizations and synthesize a representative dose-versus-time curve, e.g., the mean. Method 1 is easy to understand and explain. However, the calculation of the mean of the peak doses allows an additional degree of freedom that may inadvertently overestimate risk, because the peaks occur at different times and therefore the mean may include contributions from peaks outside of a single person's life span. This dilemma has been discussed previously in connection with the definition of the critical group, e.g., Corbett [2]. The U.S. Nuclear Regulatory Commission (NRC) has adopted Method 2, taking the peak value of the mean curve to represent the dose that the Reasonably Maximally Exposed Individual (RMEI) could receive during the regulatory time period for the purpose of defining risk. We call this the "peak-of-the-mean (POM)" approach, and believe that it is the clearest and fairest definition of risk. However, calculations and sensitivity analyses with the POM must proceed thoughtfully, since there are computational pitfalls and results are sometimes counterintuitive.
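The gap between the two interpretations can be demonstrated on synthetic dose histories. The Gaussian-pulse shape, the spread of peak times, and the lognormal heights below are invented for illustration, not TPA output. Because each realization's curve lies everywhere below its own peak, the peak of the mean can never exceed the mean of the peaks, and it falls well below it when the peaks occur at widely different times:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic dose histories: one Gaussian-shaped pulse per realization whose
# timing and height vary across realizations (illustrative shapes only).
t = np.linspace(0.0, 10_000.0, 500)                 # time, years
n_real = 1000
t_peak = rng.uniform(1_000.0, 9_000.0, n_real)      # peak times differ widely
height = rng.lognormal(mean=0.0, sigma=0.5, size=n_real)
width = 500.0                                       # pulse width, years
dose = height[:, None] * np.exp(-(((t[None, :] - t_peak[:, None]) / width) ** 2))

mean_of_peaks = dose.max(axis=1).mean()   # Method 1: ensemble of peak doses
peak_of_mean = dose.mean(axis=0).max()    # Method 2: POM
```

Averaging across realizations at each time flattens the misaligned pulses, so the POM estimate here comes out several times smaller than the mean of the peaks, which is exactly the extra degree of freedom the text warns about.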

Collaboration


Top co-authors of Sitakanta Mohanty:

Budhi Sagar (Southwest Research Institute)
Osvaldo Pensado (Southwest Research Institute)
Richard B. Codell (Nuclear Regulatory Commission)
Alan P. Morris (Southwest Research Institute)
Alexander Y. Sun (University of Texas at Austin)
Bret W. Leslie (Nuclear Regulatory Commission)
George Adams (Southwest Research Institute)
Robert W. Rice (Southwest Research Institute)
Tae M. Ahn (Nuclear Regulatory Commission)