Steven L. Simon
National Institutes of Health
Publications
Featured research published by Steven L. Simon.
American Journal of Epidemiology | 2008
Gabriel Chodick; Nural Bekiroglu; Michael Hauptmann; Bruce H. Alexander; D. Michal Freedman; Michele M. Doody; Li C. Cheung; Steven L. Simon; Robert M. Weinstock; André Bouville; Alice J. Sigurdson
The study aim was to determine the risk of cataract among radiologic technologists with respect to occupational and nonoccupational exposures to ionizing radiation and to personal characteristics. A prospective cohort of 35,705 cataract-free US radiologic technologists aged 24-44 years was followed for nearly 20 years (1983-2004) by using two follow-up questionnaires. During the study period, 2,382 cataracts and 647 cataract extractions were reported. Cigarette smoking for ≥5 pack-years; body mass index of ≥25 kg/m²; and history of diabetes, hypertension, hypercholesterolemia, or arthritis at baseline were significantly (p ≤ 0.05) associated with increased risk of cataract. In multivariate models, self-report of ≥3 x-rays to the face/neck was associated with a hazard ratio of cataract of 1.25 (95% confidence interval: 1.06, 1.47). For workers in the highest category (mean, 60 mGy) versus lowest category (mean, 5 mGy) of occupational dose to the lens of the eye, the adjusted hazard ratio of cataract was 1.18 (95% confidence interval: 0.99, 1.40). Findings challenge the National Council on Radiation Protection and International Commission on Radiological Protection assumptions that the lowest cumulative ionizing radiation dose to the lens of the eye that can produce a progressive cataract is approximately 2 Gy, and they support the hypothesis that the lowest cataractogenic dose in humans is substantially less than previously thought.
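As a rough, hypothetical illustration of the kind of survival analysis behind the hazard ratios quoted above, the sketch below fits a Cox proportional hazards model to simulated placeholder data; the column names, values, and coefficients are assumptions, not the study's data or code.

```python
# Hedged sketch (not the study's actual analysis): adjusted hazard ratios from a
# Cox proportional hazards model. All data below are simulated placeholders.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "high_lens_dose": rng.integers(0, 2, n),          # highest vs lowest dose category
    "smoking_ge5_packyears": rng.integers(0, 2, n),
    "bmi_ge25": rng.integers(0, 2, n),
})
# Simulated follow-up: event times depend weakly on the covariates (assumed effects).
hazard = 0.02 * np.exp(0.17 * df["high_lens_dose"] + 0.2 * df["smoking_ge5_packyears"])
time_to_event = rng.exponential(1.0 / hazard)
df["followup_years"] = np.minimum(time_to_event, 20.0)   # administrative censoring at 20 y
df["cataract"] = (time_to_event <= 20.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="cataract")
# Prints coefficients and exp(coef) hazard ratios with 95% CIs, the same form as
# the HR of 1.18 (0.99, 1.40) reported for the highest vs lowest lens-dose category.
cph.print_summary()
```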
Health Physics | 2008
Kwang Pyo Kim; Donald L. Miller; Stephen Balter; Ruth A. Kleinerman; Martha S. Linet; Deukwoo Kwon; Steven L. Simon
Cardiac catheterization procedures using fluoroscopy reduce patient morbidity and mortality compared to operative procedures. These diagnostic and therapeutic procedures require radiation exposure to patients and physicians. The objectives of the present investigation were to provide a systematic comprehensive summary of the reported radiation doses received by operators due to diagnostic or interventional fluoroscopically-guided procedures, to identify the primary factors influencing operator radiation dose, and to evaluate whether there have been temporal changes in the radiation doses received by operators performing these procedures. Using PubMed, we identified all English-language journal articles and other published data reporting radiation exposures to operators from diagnostic or interventional fluoroscopically-guided cardiovascular procedures from the early 1970s through the present. We abstracted the reported radiation doses, dose measurement methods, fluoroscopy system used, operational features, radiation protection features, and other relevant data. We calculated effective doses to operators in each study to facilitate comparisons. The effective doses ranged from 0.02–38.0 μSv for diagnostic catheterizations (DC), 0.17–31.2 μSv for percutaneous coronary interventions (PCI), 0.24–9.6 μSv for ablations, and 0.29–17.4 μSv for pacemaker or intracardiac defibrillator implantations. The ratios of doses between various anatomic sites and the thyroid, measured over protective shields, were 0.9 ± 1.0 for the eye, 1.0 ± 1.5 for the trunk, and 1.3 ± 2.0 for the hand. Generally, radiation dose is higher on the left side of an operator's body, because the operator's left side is closer to the primary beam when standing at the patient's right side. Modest operator dose reductions over time were observed for DC and ablation, primarily because technology improvements reduced fluoroscopy/cineradiography time and dose rate, and hence patient doses. Doses were not reduced over time for PCI; the increased complexity of medical procedures appears to have offset dose reductions due to improvements in technology. The large variation in operator doses observed for the same type of procedure suggests that optimizing procedure protocols and implementing general use of the most effective types of protective devices and shields may reduce occupational radiation doses to operators. We had considerable difficulty in comparing reported dosimetry results because of significant differences in the dosimetric methods used in each study and the multiple factors influencing the actual doses received. Better standardization of dosimetric methods will facilitate future analyses aimed at determining how well medical radiation workers are being protected.
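For readers unfamiliar with the effective-dose quantity used throughout this review, the sketch below applies the standard definition, E = Σ_T w_T H_T, using ICRP Publication 103 tissue weighting factors purely for illustration; the review's own effective-dose calculations may have used different weights and methods, and the example organ doses are invented placeholders.

```python
# Hedged sketch of the standard effective-dose definition, E = sum_T w_T * H_T,
# with ICRP Publication 103 tissue weighting factors. The organ equivalent doses
# in the example are placeholders, not values abstracted by the review.
ICRP103_WT = {
    "red_bone_marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder": 0.12,
    "gonads": 0.08,
    "bladder": 0.04, "oesophagus": 0.04, "liver": 0.04, "thyroid": 0.04,
    "bone_surface": 0.01, "brain": 0.01, "salivary_glands": 0.01, "skin": 0.01,
}

def effective_dose(organ_equivalent_doses_uSv: dict) -> float:
    """Tissue-weighted sum of organ equivalent doses (microsievert)."""
    return sum(ICRP103_WT[organ] * h for organ, h in organ_equivalent_doses_uSv.items())

# Illustrative placeholder per-procedure organ equivalent doses for an operator:
example = {"lung": 8.0, "thyroid": 20.0, "breast": 6.0, "red_bone_marrow": 4.0}
print(f"Effective dose ~ {effective_dose(example):.2f} uSv")
```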
Journal of Radiological Protection | 2009
Jolyon H Hendry; Steven L. Simon; Andrzej Wojcik; Mehdi Sohrabi; Werner Burkart; Elisabeth Cardis; D. Laurier; Margot Tirmarche; Isamu Hayata
Natural radiation is the major source of human exposure to ionising radiation, and its largest contributing component to effective dose arises from inhalation of ²²²Rn and its radioactive progeny. However, despite extensive knowledge of radiation risks gained through epidemiologic investigations and mechanistic considerations, the health effects of chronic low-level radiation exposure are still poorly understood. The present paper reviews the possible contribution of studies of populations living in high natural background radiation (HNBR) areas (Guarapari, Brazil; Kerala, India; Ramsar, Iran; Yangjiang, China), including radon-prone areas, to low dose risk estimation. Much of the direct information about risk related to HNBR comes from case-control studies of radon and lung cancer, which provide convincing evidence of an association between long-term protracted radiation exposures in the general population and disease incidence. The success of these studies is mainly due to the careful organ dose reconstruction (with relatively high doses to the lung), and to the fact that large-scale collaborative studies have been conducted to maximise the statistical power and to ensure the systematic collection of information on potential confounding factors. In contrast, studies in other (non-radon) HNBR areas have provided little information, relying mainly on ecological designs and very rough effective dose categorisations. Recent steps taken in China and India to establish cohorts for follow-up and to conduct nested case-control studies may provide useful information about risks in the future, provided that careful organ dose reconstruction is possible and information is collected on potential confounding factors.
Radiation Research | 2010
Martha S. Linet; Kwang Pyo Kim; Donald L. Miller; Ruth A. Kleinerman; Steven L. Simon; Amy Berrington de Gonzalez
Abstract Epidemiological studies of medical radiation workers have found excess risks of leukemia, skin and female breast cancer in those employed before 1950 but little consistent evidence of cancer risk increases subsequently. Occupational radiation-related dose–response data and recent and lifetime cancer risk data are limited for radiologists and radiologic technologists and lacking for physicians and technologists performing fluoroscopically guided procedures. Survey data demonstrate that occupational doses to radiologists and radiologic technologists have declined over time. Eighty mostly small studies of cardiologists and fewer studies of other physicians reveal that effective doses to physicians per interventional procedure vary by more than an order of magnitude. For medical radiation workers, there is an urgent need to expand the limited information on average annual, time-trend and organ doses from occupational radiation exposures and to assess lifetime cancer risks of these workers. For physicians and technologists performing interventional procedures, more information about occupational doses should be collected and long-term follow-up studies of cancer and other serious disease risks should be initiated. Such studies will help optimize standardized protocols for radiologic procedures, determine whether current radiation protection measures for medical radiation workers are adequate, provide guidance on cancer screening needs, and yield valuable insights on cancer risks associated with chronic radiation exposure.
Health Physics | 2012
Kwang Pyo Kim; Donald L. Miller; Amy Berrington de Gonzalez; Stephen Balter; Ruth A. Kleinerman; Evgenia Ostroumova; Steven L. Simon; Martha S. Linet
Abstract In the past 30 y, the numbers and types of fluoroscopically-guided (FG) procedures have increased dramatically. The objective of the present study is to provide estimated radiation doses to physician specialists, other than cardiologists, who perform FG procedures. The authors searched Medline to identify English-language journal articles reporting radiation exposures to these physicians. They then identified several primarily therapeutic FG procedures that met specific criteria: well-defined procedures for which there were at least five published reports of estimated radiation doses to the operator, procedures performed frequently in current medical practice, and inclusion of physicians from multiple medical specialties. These procedures were percutaneous nephrolithotomy (PCNL), vertebroplasty, orthopedic extremity nailing for treatment of fractures, biliary tract procedures, transjugular intrahepatic portosystemic shunt creation (TIPS), head/neck endovascular therapeutic procedures, and endoscopic retrograde cholangiopancreatography (ERCP). Radiation doses and other associated data were abstracted, and effective dose to operators was estimated. Operators received estimated doses per patient procedure equivalent to doses received by interventional cardiologists. The estimated effective dose per case ranged from 1.7–56 μSv for PCNL, 0.1–101 μSv for vertebroplasty, 2.5–88 μSv for orthopedic extremity nailing, 2.0–46 μSv for biliary tract procedures, 2.5–74 μSv for TIPS, 1.8–53 μSv for head/neck endovascular therapeutic procedures, and 0.2–49 μSv for ERCP. Overall, mean operator radiation dose per case measured over personal protective devices at different anatomic sites on the head and body ranged from 19–800 (median = 113) μSv at eye level, 6–1,180 (median = 75) μSv at the neck, and 2–1,600 (median = 302) μSv at the trunk. Operators' hands often received greater doses than the eyes, neck, or trunk. Large variations in operator doses suggest that optimizing procedure protocols and proper use of protective devices and shields might reduce occupational radiation dose substantially.
Health Physics | 1998
Steven L. Simon
Ingestion of soil by humans has been a documented phenomenon for centuries and, according to various literature, still takes place today. The literature reviewed here shows that there are two distinct soil ingestion phenomena: inadvertent and purposeful (geophagia). Certain lifestyles, occupations, and living conditions will likely put different individuals or different groups at risk of these separate, but sometimes related, phenomena. In particular, reports of geophagia are relatively common during adolescence and periods of growth, and during pregnancy and lactation. Geophagia also appears to be relatively common among indigenous peoples on all continents, sometimes taking place to extreme degrees. Because of their high dependence on the land, indigenous peoples are also at highest risk for inadvertent ingestion. Inadvertent intake is more a function of either primitive living conditions or professions that bring workers into close and continual contact with the soil. The purpose of this report is to review and summarize literature related to ingestion of soil by humans, with emphasis on the relevance of soil ingestion to radiological dose assessment; the etiology of geophagia and its relationship to risk assessment; qualitative observations and quantitative studies of direct soil ingestion by humans, with interpretations useful for different lifestyle scenarios; the status of a number of current radiological assessment models in accounting for soil ingestion; and some unresolved issues in modeling the ingestion of soil.
Medical Physics | 2011
Choonsik Lee; Kwang Pyo Kim; D Long; R Fisher; Chris Tien; Steven L. Simon; André Bouville; Wesley E. Bolch
PURPOSE: To develop a computed tomography (CT) organ dose estimation method designed to readily provide organ doses in a reference adult male and female for different scan ranges, and to investigate the degree to which existing commercial programs can reasonably match organ doses defined in these more anatomically realistic adult hybrid phantoms. METHODS: The x-ray fan beam in the SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code MCNPX 2.6. The simulated CT scanner model was validated through comparison with experimentally measured lateral free-in-air dose profiles and computed tomography dose index (CTDI) values. The reference adult male and female hybrid phantoms were coupled with the established CT scanner model, following arm removal, to simulate clinical head and other body region scans. A set of organ dose matrices was calculated for a series of consecutive axial scans ranging from the top of the head to the bottom of the phantoms, with a beam thickness of 10 mm and tube potentials of 80, 100, and 120 kVp. The organ doses for head, chest, and abdomen/pelvis examinations were calculated from the organ dose matrices and compared to those obtained from two commercial programs, CT-EXPO and CTDOSIMETRY. Organ dose calculations were repeated for an adult stylized phantom using the same simulation method used for the adult hybrid phantoms. RESULTS: Comparisons of experimentally measured lateral free-in-air dose profiles and CTDI values with the Monte Carlo simulations showed good agreement, to within 9%. Organ doses for head, chest, and abdomen/pelvis scans reported by the commercial programs exceeded those from the Monte Carlo calculations in both the hybrid and stylized phantoms in this study, sometimes by orders of magnitude. CONCLUSIONS: The organ dose estimation method and dose matrices established in this study readily provide organ doses for a reference adult male and female for different CT scan ranges and technical parameters. Organ doses from existing commercial programs do not reasonably match organ doses calculated for the hybrid phantoms, due to differences in phantom anatomy as well as differences in organ dose scaling parameters. The organ dose matrices developed in this study will be extended to cover different technical parameters, CT scanner models, and various age groups.
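As a hedged sketch of how a precomputed organ dose matrix of the kind described above could be applied, the code below sums per-slice organ dose contributions over a chosen scan range and scales by tube current-time product; the matrix values, the per-100-mAs normalization, and the scan parameters are assumptions, not the paper's data or workflow.

```python
# Hedged sketch of using a precomputed organ dose matrix: rows are axial slice
# positions, columns are organs, and entries are assumed to be organ dose per
# 100 mAs. The dose for a scan is the sum over the covered slices, scaled by mAs.
import numpy as np

rng = np.random.default_rng(0)
n_slices, organs = 170, ["thyroid", "lung", "stomach", "colon"]
dose_matrix = rng.random((n_slices, len(organs)))   # placeholder mGy per 100 mAs

def organ_doses(start_slice: int, stop_slice: int, mAs: float) -> dict:
    """Organ doses (mGy) for consecutive axial scans covering [start_slice, stop_slice)."""
    contribution = dose_matrix[start_slice:stop_slice].sum(axis=0)
    return dict(zip(organs, contribution * (mAs / 100.0)))

# Example chest scan covering slices 40-90 at 120 mAs (all numbers illustrative):
print(organ_doses(40, 90, mAs=120))
```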
International Journal of Cancer | 2008
Parveen Bhatti; Jeffery P. Struewing; Bruce H. Alexander; Michael Hauptmann; Laura Bowen; Lutecia H. Mateus-Pereira; Marbin Pineda; Steven L. Simon; Robert M. Weinstock; Marvin Rosenstein; Marilyn Stovall; Dale L. Preston; Martha S. Linet; Michele M. Doody; Alice J. Sigurdson
High‐dose ionizing radiation exposure to the breast and rare autosomal dominant genes have been linked with increased breast cancer risk, but the role of low‐to‐moderate doses from protracted radiation exposure in breast cancer risk and its potential modification by polymorphisms in DNA repair genes has not been previously investigated among large numbers of radiation‐exposed women with detailed exposure data. Using carefully reconstructed estimates of cumulative breast doses from occupational and personal diagnostic ionizing radiation, we investigated the potential modification of radiation‐related breast cancer risk by 55 candidate single nucleotide polymorphisms in 17 genes involved in base excision or DNA double‐strand break repair among 859 cases and 1083 controls from the United States Radiologic Technologists (USRT) cohort. In multivariable analyses, WRN V114I (rs2230009) significantly modified the association between cumulative occupational breast dose and risk of breast cancer (adjusted for personal diagnostic exposure) (p = 0.04) and BRCA1 D652N (rs4986850), PRKDC IVS15 + 6C > T (rs1231202), PRKDC IVS34 + 39T > C (rs8178097) and PRKDC IVS31 − 634C > A (rs10109984) significantly altered the personal diagnostic radiation exposure‐response relationship (adjusted for occupational dose) (p ≤ 0.05). None of the remaining 50 SNPs significantly modified breast cancer radiation dose‐response relationships. The USRT genetic study provided a unique opportunity to examine the joint effects of common genetic variation and ionizing radiation exposure on breast cancer risk using detailed occupational and personal diagnostic exposure data. The suggestive evidence found for modification of radiation‐related breast cancer risk for 5 of the 55 SNPs evaluated requires confirmation in larger studies of women with quantified radiation breast doses in the low‐to‐moderate range.
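A minimal, hypothetical sketch of how effect modification by a SNP can be tested with a dose-by-genotype interaction term in a logistic model is given below; the simulated data, variable names, and coefficients are assumptions and do not reproduce the USRT analysis.

```python
# Hedged sketch of testing gene x radiation effect modification in a case-control
# setting via a dose-by-genotype interaction term. All data are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "occ_dose_mGy": rng.gamma(2.0, 20.0, n),      # cumulative occupational breast dose
    "diag_dose_mGy": rng.gamma(2.0, 10.0, n),     # personal diagnostic breast dose
    "wrn_v114i": rng.integers(0, 2, n),           # carries the variant allele (0/1)
})
logit_p = -1.0 + 0.004 * df["occ_dose_mGy"] + 0.01 * df["occ_dose_mGy"] * df["wrn_v114i"]
df["case"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

model = smf.logit("case ~ occ_dose_mGy * wrn_v114i + diag_dose_mGy", data=df).fit(disp=False)
# The interaction p-value indicates whether the occupational dose-response differs
# by genotype (cf. the p = 0.04 reported above for WRN V114I).
print("interaction p-value:", model.pvalues["occ_dose_mGy:wrn_v114i"])
```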
Epidemiology | 2006
Joseph L. Lyon; Stephen C. Alder; Mary Bishop Stone; Alan Scholl; James C. Reading; Richard Holubkov; Xiaoming Sheng; George L. White; Kurt T. Hegmann; Lynn R. Anspaugh; F. Owen Hoffman; Steven L. Simon; Brian A. Thomas; Raymond J. Carroll; A. Wayne Meikle
Background: A study was begun in 1965 to 1966 to determine whether children exposed to radioactive iodine from nuclear weapons testing at the Nevada Test Site from 1951 through 1962 were at higher risk of thyroid disease. In 1993, we reported that among those examined in 1985 to 1986 (Phase II) there was an association between radiation from the Nevada Test Site and thyroid neoplasms. Methods: We reevaluated the relationship between exposure to Nevada Test Site fallout and thyroid disease using newly corrected dose estimates and disease outcomes from the Phase II study. A prospective cohort of school children 12 to 18 years old living in Utah, Nevada, and Arizona was first examined for thyroid disease in 1965 to 1966 and reexamined in 1985 to 1986. In the Phase II report, 2497 subjects formed the basis for this analysis. Thyroid disease, including thyroid neoplasms and thyroiditis, was expressed as cumulative incidence and risk ratios (RRs) with a dose–response expressed as excess risk ratio (ERR/Gy). Results: The RR between thyroid radiation dose in the highest dose group and thyroid neoplasms increased from 3.4 (in the earlier analysis) to 7.5. The RR for thyroiditis increased from 1.1 to 2.7 with an ERR/Gy of 4.9 (95% confidence interval = 2.0 to 10.0). There were too few malignant thyroid neoplasms to estimate risk. Conclusions: Persons exposed to radioactive iodine as children have an increased risk of thyroid neoplasms and autoimmune thyroiditis up to 30 years after exposure.
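For context, an ERR/Gy estimate such as the one quoted above corresponds to a linear relative-risk model, RR(D) = 1 + (ERR/Gy) × D. The short sketch below simply evaluates that expression at a few illustrative doses using the thyroiditis estimate of 4.9 per Gy from the abstract; the doses themselves are arbitrary examples.

```python
# Hedged illustration of the linear excess-relative-risk model behind an ERR/Gy
# estimate: RR(D) = 1 + (ERR per Gy) * D. The example doses are arbitrary.
def relative_risk(dose_gy: float, err_per_gy: float) -> float:
    """Relative risk at a given thyroid dose under a linear ERR model."""
    return 1.0 + err_per_gy * dose_gy

for dose in (0.05, 0.2, 0.5, 1.0):
    print(f"dose = {dose:4.2f} Gy  ->  RR = {relative_risk(dose, err_per_gy=4.9):.2f}")
```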
Radiation Research | 2006
Steven L. Simon; Robert M. Weinstock; Michele M. Doody; James W. Neton; Thurman B. Wenzl; Patricia A. Stewart; Aparna K. Mohan; R. Craig Yoder; Michael Hauptmann; D. Michal Freedman; John Cardarelli; H. Amy Feng; André Bouville; Martha S. Linet
Abstract Simon, S. L., Weinstock, R. M., Doody, M. M., Neton, J., Wenzl, T., Stewart, P., Mohan, A. K., Yoder, C., Freedman, M., Hauptmann, M., Bouville, A., Cardarelli, J., Feng, H. A. and Linet, M. Estimating Historical Radiation Doses to a Cohort of U.S. Radiologic Technologists. Radiat. Res. 166, 174–192 (2006). Data have been collected and physical and statistical models have been constructed to estimate unknown occupational radiation doses among 90,000 members of the U.S. Radiologic Technologists cohort who responded to a baseline questionnaire during the mid-1980s. Since the availability of radiation dose data differed by calendar period, different models were developed and applied for years worked before 1960, 1960–1976, and 1977–1984. The dose estimation used available film-badge measurements (approximately 350,000) for individual cohort members, information provided by the technologists on their work history and protection practices, and measurement and other data derived from the literature. The dosimetry model estimates annual and cumulative occupational badge doses (personal dose equivalent) for each technologist for each year worked from 1916 through 1984, as well as absorbed doses to organs and tissues including bone marrow, female breast, thyroid, ovary, testes, lung, and skin. Assumptions have been made about critical variables including average energy of X rays, use of protective aprons, position of film badges, and minimum detectable doses. Uncertainty of badge and organ doses was characterized for each year of each technologist's working career. Monte Carlo methods were used to generate estimates of cumulative organ doses for preliminary cancer risk analyses. The models and predictions presented here, while continuing to be modified and improved, represent one of the most comprehensive dose reconstructions undertaken to date for a large cohort of medical radiation workers.
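A generic, hypothetical sketch of Monte Carlo dose reconstruction in the spirit described above is shown below: annual badge doses with lognormal uncertainty are sampled, multiplied by an uncertain badge-to-organ conversion factor, and accumulated over a career. Every parameter value is a placeholder and none of it is taken from the USRT dosimetry system.

```python
# Hedged, generic sketch of Monte Carlo dose reconstruction. All parameter values
# below (central badge doses, geometric standard deviations, conversion factor)
# are placeholders, not those of the USRT models.
import numpy as np

rng = np.random.default_rng(42)
n_realizations = 10_000

annual_badge_mgy = {1955: 8.0, 1956: 6.5, 1957: 5.0}   # hypothetical central estimates
badge_gsd = 1.8                                         # assumed GSD of a badge dose
organ_factor_mean, organ_factor_gsd = 0.6, 1.3          # assumed badge-to-organ factor

cumulative = np.zeros(n_realizations)
for year, central in annual_badge_mgy.items():
    badge = rng.lognormal(np.log(central), np.log(badge_gsd), n_realizations)
    factor = rng.lognormal(np.log(organ_factor_mean), np.log(organ_factor_gsd), n_realizations)
    cumulative += badge * factor                        # organ dose contribution for this year

print("median cumulative organ dose (mGy):", np.percentile(cumulative, 50).round(1))
print("95% uncertainty interval (mGy):", np.percentile(cumulative, [2.5, 97.5]).round(1))
```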