A. Sharif Heger
University of New Mexico
Publications
Featured research published by A. Sharif Heger.
Nuclear Science and Engineering | 1994
Keith E. Holbert; A. Sharif Heger; Nahrul K. Alang-Rashid
This research is motivated by the need to relax the strict boundary of numeric-based signal validation. To this end, the use of fuzzy logic for redundant sensor validation is introduced. Since signal validation employs both numbers and qualitative statements, fuzzy logic provides a pathway for transforming human abstractions into the numerical domain and thus coupling both sources of information. With this transformation, linguistically expressed analysis principles can be coded into a classification rule-base for signal failure detection and identification.
Annals of Nuclear Energy | 1996
A. Sharif Heger; Keith E. Holbert; A. Muneer Ishaque
Abstract A fuzzy logic instrument fault detection scheme is developed for systems having two or three redundant sensors. In the fuzzy logic approach the deviation between each signal pairing is computed and classified into three fuzzy sets. A rule base is created allowing the human perception of the situation to be represented mathematically. Fuzzy associative memories are then applied. Finally, a defuzzification scheme is used to find the centroid location, and hence the signal status. Real-time analyses are carried out to evaluate the instantaneous signal status as well as the long-term results for the sensor set. Instantaneous signal validation results are used to compute a best estimate for the measured state variable. The long-term sensor validation method uses a frequency fuzzy variable to determine the signal condition over a specific period. To corroborate the methodology, synthetic data representing various anomalies are analyzed with both the fuzzy logic technique and the parity space approach.
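The pairwise-deviation and centroid-defuzzification steps described above can be sketched as follows. The membership functions, fuzzy-set breakpoints, and output values here are illustrative assumptions, not the published rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_deviation(d):
    """Fuzzify the absolute deviation between a sensor pair into three sets."""
    return {
        "SMALL": tri(d, -0.1, 0.0, 0.5),
        "MEDIUM": tri(d, 0.2, 0.5, 0.8),
        "LARGE": tri(d, 0.5, 1.0, 1.6),
    }

def signal_status(s1, s2):
    """Defuzzify by centroid: 0.0 ~ agreeing pair, 1.0 ~ failed pair."""
    mu = classify_deviation(abs(s1 - s2))
    out = {"SMALL": 0.0, "MEDIUM": 0.5, "LARGE": 1.0}  # assumed output values
    den = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / den if den else 0.0
```

A closely agreeing pair such as `signal_status(1.00, 1.02)` defuzzifies to 0.0, while a strongly deviating pair defuzzifies toward 1.0, flagging a suspect sensor.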
Journal of Environmental Science and Health, Part A: Toxic/Hazardous Substances & Environmental Engineering | 1997
Hien N. Pham; Ebtisam Wilkins; A. Sharif Heger; David Kauffman
Abstract Spore‐forming Bacillus pumilus was used as a model for TiO2‐based photocatalysis. Previous results have shown that for different initial spore densities between 10⁴ and 10¹⁰ CFU/ml, more inactivation of viable spores in aqueous TiO2 suspension occurred with an increase in spore density. The results were different from published results for different initial concentrations of organic pollutants and non‐spore‐forming organisms (e.g., E. coli). To determine a plausible explanation for the results obtained for the B. pumilus spores, a quantitative analysis has been performed based on the theory of probability. Since hydroxyl radicals (·OH) have been thought to be the primary species responsible for degrading/inactivating contaminants in water, a probabilistic approach will be used to determine quantitatively the likelihood that an interaction, or a collision, between a hydroxyl radical and a B. pumilus spore will occur, or that a hydroxyl radical interacting with a spore is viable. Once a mathematica...
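The probabilistic argument can be illustrated with a minimal sketch. Treating radical-spore encounters as a Poisson process is an assumption here, and the encounter rate is a hypothetical parameter, not a measured value.

```python
import math

def p_at_least_one_hit(expected_hits):
    """If hydroxyl-radical encounters with a given spore follow a Poisson
    process with mean `expected_hits`, then
    P(at least one encounter) = 1 - exp(-expected_hits)."""
    return 1.0 - math.exp(-expected_hits)

# A spore expecting 3 encounters is very likely to be hit at least once.
print(f"{p_at_least_one_hit(3.0):.3f}")  # → 0.950
```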
Reliability Engineering & System Safety | 1997
A. Sharif Heger; Janis E. White
Abstract Decision-making under uncertainty describes most environmental remediation and waste management problems. Inherent limitations in knowledge concerning contaminants, environmental fate and transport, remedies, and risks force decision-makers to select a course of action based on uncertain and incomplete information. Because uncertainties can be reduced by collecting additional data, uncertainty and sensitivity analysis techniques have received considerable attention. When costs associated with reducing uncertainty are considered in a decision problem, the objective changes; rather than determine what data to collect to reduce overall uncertainty, the goal is to determine what data to collect to best differentiate between possible courses of action or decision alternatives. Environmental restoration and waste management requires cost-effective methods for characterization and monitoring, and these methods must also satisfy regulatory requirements. Characterization and monitoring activities imply that, sooner or later, a decision must be made about collecting new field data. Limited fiscal resources for data collection should be committed only to those data that have the most impact on the decision at the lowest possible cost. Applying influence diagrams in combination with data worth analysis produces a method which not only satisfies these requirements but also gives rise to an intuitive representation of complex structures not possible in the more traditional decision tree representation. This paper demonstrates the use of influence diagrams in data worth analysis by applying it to a monitor-and-treat problem often encountered in environmental decision problems.
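The core of data worth analysis, deciding whether new data can change the preferred alternative, can be sketched as a toy expected-value-of-perfect-information calculation. The states, probabilities, and costs below are invented for illustration and are not from the paper.

```python
# P(contaminant plume reaches the well) under current information (assumed).
p_reach = 0.3

# Cost of each alternative under each state of nature (hypothetical units).
cost = {
    "treat":   {"reach": 100, "no_reach": 100},  # fixed remediation cost
    "monitor": {"reach": 400, "no_reach": 20},   # cheap unless the plume arrives
}

def expected_cost(action, p):
    return p * cost[action]["reach"] + (1 - p) * cost[action]["no_reach"]

# Best action with current (prior) information.
prior_best = min(cost, key=lambda a: expected_cost(a, p_reach))
prior_cost = expected_cost(prior_best, p_reach)

# With perfect information we would pick the cheaper action in each state.
perfect_cost = (p_reach * min(c["reach"] for c in cost.values())
                + (1 - p_reach) * min(c["no_reach"] for c in cost.values()))

# New field data are worth at most this much to the decision.
evpi = prior_cost - perfect_cost
```

Here the prior-optimal action is "treat" at expected cost 100, perfect information lowers the expected cost to 44, so no characterization program costing more than 56 is worth funding for this decision.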
Reliability Engineering & System Safety | 1995
A. Sharif Heger; Jayaram K. Bhat; Desmond W. Stack; Dale V. Talbott
In this paper a method for calculating top-event exact probability is presented. The method combines the ΣΠ algorithm of Corynen and the pattern recognition scheme of Koen et al. The PC-based program that is based on this method is called ΣΠ-Patrec and computes the exact probability of the top-event of a system fault tree model as defined by its cut sets. The ΣΠ module of the program partitions and disjoints the cut sets and solves the resultant sub-models recursively. The pattern recognition module of the code reduces the computational complexity by recognizing repeated sub-models in the calculation process and thus avoiding repeated evaluations. ΣΠ-Patrec can evaluate both coherent and incoherent fault trees. The input to ΣΠ-Patrec is a collection of cut sets in disjunctive normal form, which need not be minimal. The description of the algorithm is presented through an example problem. The results of several experiments with large accident sequences are also presented.
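What "exact top-event probability from cut sets" means can be shown with a brute-force inclusion-exclusion sketch over independent basic events. This is not the ΣΠ partitioning algorithm itself, which reaches the same result far more efficiently on large trees; the example tree below is invented.

```python
from itertools import combinations

def top_event_probability(cut_sets, p):
    """Exact P(top event) for independent basic events, computed by
    inclusion-exclusion over the (not necessarily minimal) cut sets."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        sign = (-1) ** (r + 1)
        for combo in combinations(cut_sets, r):
            events = set().union(*combo)      # union of the chosen cut sets
            prob = 1.0
            for e in events:
                prob *= p[e]                  # independence assumption
            total += sign * prob
    return total

# Example fault tree: top = (A and B) or (A and C)
p = {"A": 0.1, "B": 0.2, "C": 0.3}
cuts = [{"A", "B"}, {"A", "C"}]
```

For this tree, P(top) = P(AB) + P(AC) - P(ABC) = 0.02 + 0.03 - 0.006 = 0.044.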
Artificial Intelligence in Engineering | 1998
Hrishikesh B. Aradhye; A. Sharif Heger
Abstract An application of memory-based reasoning (MBR) for determining the relevancy of retrieved information, customized for the needs and preferences of an individual user, is presented. The use of the MBR method in conjunction with a classification scheme such as k-nearest neighbors (k-NN) can set the foundation for an intelligent agent for the search of patterns in databases. An intelligent agent is an adaptive database search system that can help reduce the user's information-overload problem. To this end, a system called VINAYAK has been developed to investigate various issues concerned with information retrieval as an automated function. Current experiments have focused on chemical engineering literature search and safety-related retrieval through nuclear databases. The results point toward the advantage of using the intelligent-agent methodology in conjunction with a database search engine.
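The MBR-plus-k-NN idea, classifying a new document against a memory of user-rated examples, can be sketched as follows. The feature encoding, labels, and data are made up for illustration and are not VINAYAK's actual representation.

```python
from collections import Counter

def knn_relevance(query, memory, k=3):
    """memory: list of (feature_vector, label) pairs previously rated by
    the user. Returns the majority label among the k nearest neighbours."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))  # squared Euclidean
    nearest = sorted(memory, key=lambda item: dist(item[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Documents encoded as tiny hypothetical feature vectors (e.g. topic scores).
memory = [((0.9, 0.1), "relevant"), ((0.8, 0.2), "relevant"),
          ((0.1, 0.9), "irrelevant"), ((0.2, 0.8), "irrelevant"),
          ((0.85, 0.15), "relevant")]
```

A retrieved document encoded near the "relevant" cluster, e.g. `knn_relevance((0.9, 0.2), memory)`, is classified as relevant; the memory grows as the user rates more results, which is what makes the agent adaptive.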
Nuclear Science and Engineering | 2005
Francisco J. Souto; Robert Kimpland; A. Sharif Heger
Abstract One of the primary methods to produce medical isotopes, such as ⁹⁹Mo, is by irradiation of uranium targets in heterogeneous reactors. Solution reactors present a potential alternative to produce medical isotopes. The Medical Isotope Production Reactor (MIPR) concept has been proposed to produce medical isotopes with lower uranium consumption and waste than those in heterogeneous reactors. Commercial production of medical isotopes in solution reactors requires steady-state operation at ~200 kW. At this power regime, fuel-solution temperature increase and radiolytic-gas bubble formation introduce a negative reactivity feedback that has to be mitigated. A model based on the point reactor kinetic equations has been developed to investigate these reactivity effects. This model has been validated against experimental results from the Los Alamos National Laboratory uranyl fluoride Solution High-Energy Burst Assembly (SHEBA) and shows the feasibility of solution reactors for the commercial production of medical isotopes.
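The structure of such a point-kinetics model can be sketched with one delayed-neutron group and a simple negative temperature feedback, integrated by forward Euler. All parameter values below are illustrative assumptions, not SHEBA data, and the published model includes effects (e.g. radiolytic-gas voids) omitted here.

```python
# Kinetics parameters (assumed): delayed fraction, precursor decay
# constant (1/s), and neutron generation time (s).
beta, lam, Lambda = 0.0065, 0.08, 1e-4
alpha_T = -1e-5   # reactivity per degree: negative feedback (assumed)
heat = 1e-2       # temperature rise per unit power per second (assumed, no cooling)

def step(n, c, T, rho0, dt=1e-3):
    """One Euler step of the point kinetics equations with feedback."""
    rho = rho0 + alpha_T * T                    # net reactivity
    dn = ((rho - beta) / Lambda) * n + lam * c  # neutron population
    dc = (beta / Lambda) * n - lam * c          # delayed precursors
    dT = heat * n                               # crude fuel-solution heating
    return n + dt * dn, c + dt * dc, T + dt * dT

# Start critical at unit power with equilibrium precursors, then insert
# a small positive reactivity step.
n, c, T = 1.0, beta / (lam * Lambda), 0.0
for _ in range(5000):                           # 5 s of simulated time
    n, c, T = step(n, c, T, rho0=0.002)
```

Power rises on a delayed-neutron-controlled period while the growing temperature eats away at the inserted reactivity, which is the self-limiting behaviour the abstract describes.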
Radiation Effects Data Workshop | 2010
Keith E. Holbert; A. Sharif Heger; Steven S. McCready
Prompted by the unexpected failure of piezoresistive sensors in both an elevated gamma-ray environment and reactor core pulse tests, we initiated radiation testing of several MEMS piezoresistive accelerometers and pressure transducers to ascertain their radiation hardness. Some commercial off-the-shelf sensors are found to be viable options for use in a high-energy pulsed reactor, but others suffer severe degradation and even catastrophic failure. Although researchers are promoting the use of MEMS devices in radiation-harsh environments, we nevertheless find assurance testing necessary.
Nuclear Technology | 1997
Dan Glenn; A. Sharif Heger; William B. Hladik
Nearly all the ⁹⁹ᵐTc administered to patients is obtained by eluting a radionuclide generator. The generators manufactured by the US radiopharmaceutical companies use only the high-specific-activity molybdenum produced by the fission of uranium. The dominant production methods are those used by Cintichem, Inc. and Nordion International. There are, however, competing methods for the production of fission-based ⁹⁹Mo. One of the most promising proposed alternatives is the use of solution reactors (or homogeneous reactors). The operational characteristics of conventional reactors (i.e., the Cintichem process) and those of solution reactors to produce ⁹⁹Mo for use in manufacturing ⁹⁹Mo/⁹⁹ᵐTc generators are examined. The use of conventional reactors has the disadvantage of generating large amounts of radioactive waste; the use of solution reactors can significantly reduce this problem. Both methods require rigorous processing to meet purity requirements due to the presence of fission-product contamination.
Reliability Engineering & System Safety | 1993
A. Sharif Heger; Joe R. Hill
Abstract This paper proposes the use of probability networks to handle uncertainty in the performance assessment of high-level nuclear waste repositories. Probability networks provide coherent methods for:
• describing uncertain expert opinion;
• providing both qualitative and quantitative representations of an expert's knowledge;
• combining different sources of uncertainty, including the uncertainty in scenarios, conceptual models, mathematical models, parameters, and data;
• performing sensitivity analyses to determine influential components of the performance model;
• making decisions regarding the data that should be collected to reduce uncertainty;
• reconciling inconsistencies among experts; and
• assessing the regulatory compliance of a particular repository.
These features are illustrated through the use of a simplified example in this paper.
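How a probability network combines scenario uncertainty with evidence can be sketched with a two-node example: a scenario node feeding an outcome node, queried by marginalization and Bayes' rule. The nodes and numbers are invented for illustration.

```python
# P(fast groundwater-transport scenario) from expert opinion (assumed).
p_fast = 0.2

# Conditional table: P(release exceeds the regulatory limit | scenario).
p_exceed = {"fast": 0.6, "slow": 0.05}

# Marginal probability of exceeding the limit (sum over scenarios).
p_marginal = p_fast * p_exceed["fast"] + (1 - p_fast) * p_exceed["slow"]

# Bayes update: a field measurement indicates "exceed"; revise the
# scenario probability in light of the evidence.
p_fast_given_exceed = p_fast * p_exceed["fast"] / p_marginal
```

Here the prior marginal is 0.16, and observing an exceedance raises the fast-scenario probability from 0.2 to 0.75, which is the kind of expert-opinion revision and data-worth reasoning the networks support.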