G. Emi-Reynolds
Ghana Atomic Energy Commission
Publication
Featured research published by G. Emi-Reynolds.
Radiation Protection Dosimetry | 2010
E. O. Darko; A. Faanu; A. R. Awudu; G. Emi-Reynolds; J. Yeboah; O. C. Oppon; E. H. K. Akaho
The results of studies carried out on the public exposure contribution from naturally occurring radioactive materials (NORMs) in two open-pit mines in the Western and Ashanti regions of Ghana are reported. The studies were carried out under the International Atomic Energy Agency-supported Technical Co-operation Project GHA/9/005. Measurements were made on samples of water, soil, ore, mine tailings and air using gamma spectrometry. Solid-state nuclear track detectors were used for radon concentration measurements. A survey was also carried out to determine the ambient gamma dose rate in the vicinity of the mines and surrounding areas. The effective doses due to external gamma irradiation, ingestion of water and inhalation of radon and ore dusts were calculated for the two mines. The average annual effective dose was found to be 0.30 ± 0.06 mSv, which is within the range of levels published for other countries. The study provides useful information and data for establishing a comprehensive framework to investigate other mines and develop guidelines for the monitoring and control of NORMs in the mining industry and the environment as a whole in Ghana.
Radiation Protection Dosimetry | 2014
A. Faanu; H. Lawluvi; D. O. Kpeglo; E. O. Darko; G. Emi-Reynolds; A. R. Awudu; O. K. Adukpo; C. Kansaana; I. D. Ali; B. Agyeman; L. Agyeman; R. Kpodzro
Studies have been carried out within and around the operational area of the Chirano Gold Mine Ltd of Ghana to ascertain the baseline radioactivity levels of naturally occurring radioactive materials as well as artificial radionuclides in the surface and underground mines. The analysis was carried out using gamma spectrometry to quantify the radionuclides of interest, namely (238)U, (232)Th, (137)Cs and (40)K in soil, ore, waste rock and water samples. The average activity concentrations of (238)U, (232)Th, (40)K and (137)Cs in the soil/rock samples were 9.79±5.39, 9.18±7.06, 237.40±144.34 and 0.64±0.57 Bq kg(-1), respectively. For the water samples, the average activity concentrations were 0.86±0.67, 0.97±1.33 and 9.05±10.45 Bq l(-1) for (226)Ra, (232)Th and (40)K, respectively. The total annual effective dose to the public was estimated to be 0.13 mSv, which is below the International Commission on Radiological Protection recommended level of 1 mSv for public exposure control. The study also assessed the elemental concentrations of U, Th and K in the soil/rock samples from the gold mine and surrounding communities. The average concentrations of U, Th and K were 0.82±0.48, 2.18±1.77 µg g(-1) and 0.77±0.47 %, respectively. The concentrations of U, Th and K varied among soil and rock samples taken from different locations in the study area, with values in the ranges 0.28-2.21 µg g(-1), 0.24-6.50 µg g(-1) and 0.28-1.87 %, respectively. The concentrations of U, Th and K are far lower than the world average values but comparable with the ranges from similar studies in other countries. The gross-alpha and gross-beta concentration values for all the water samples were below the Ghana Standards Authority and World Health Organisation recommended guideline values for drinking water quality.
The results obtained in this study also show that radiation levels are within the natural background radiation levels found in the literature and compare well with those of similar studies for other countries including Ghana.
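The external-gamma pathway described in this abstract can be approximated directly from the reported soil activity concentrations. Below is a minimal sketch, assuming the widely used UNSCEAR 2000 dose-rate conversion coefficients (0.462, 0.604 and 0.0417 nGy h⁻¹ per Bq kg⁻¹ for (238)U, (232)Th and (40)K), an outdoor occupancy factor of 0.2 and a 0.7 Sv Gy⁻¹ dose conversion factor; it reproduces only one pathway of the total 0.13 mSv estimate, not the study's full calculation:

```python
# Hedged sketch: outdoor external gamma dose from soil activity concentrations.
# Conversion coefficients are UNSCEAR 2000 values; the occupancy factor (0.2)
# and absorbed-dose-to-effective-dose factor (0.7 Sv/Gy) are standard assumptions.

def absorbed_dose_rate(c_u, c_th, c_k):
    """Absorbed dose rate in air 1 m above ground, in nGy/h.
    c_u, c_th, c_k: activity concentrations of 238U, 232Th and 40K in Bq/kg."""
    return 0.462 * c_u + 0.604 * c_th + 0.0417 * c_k

def annual_effective_dose_msv(dose_rate_ngy_h, occupancy=0.2):
    """Annual effective dose (mSv) from an outdoor absorbed dose rate."""
    hours_per_year = 8760
    sv_per_gy = 0.7  # dose conversion factor for adults
    return dose_rate_ngy_h * hours_per_year * occupancy * sv_per_gy * 1e-6

# Average soil/rock concentrations reported for the Chirano study
d = absorbed_dose_rate(9.79, 9.18, 237.40)   # roughly 20 nGy/h
e = annual_effective_dose_msv(d)             # roughly 0.024 mSv from this pathway alone
```

The result is consistent with the study's total of 0.13 mSv, since ingestion and other pathways contribute the remainder.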
Radiation Protection Dosimetry | 2013
D. Adjei; E. O. Darko; J. K. Annkah; J. K. Amoako; K. Ofori; G. Emi-Reynolds; M. K. Obeng; E. Akomaning-Adofo; P. Owusu-Manteaw
Analyses of the results of calibrations of survey meters carried out at the Secondary Standards Dosimetry Laboratory (SSDL) in Ghana over a period of 4 y (2008-2011) are reported. The calibration factors (CFs) of the set of survey meters indicated that ∼91.04 % were within the acceptable limit of ±20.0 %. Most of the survey meters had CFs in the range 0.95-1.15, although a few indicated values <0.55 and some recorded CFs >1.15. The uncertainty in the measurements ranged from 0.03 to 17 %, with the majority between 0.03 and 6.0 % and a few >6.0 %. The results show that most of the survey meters calibrated met the requirements of the regulations, and the data may support future development of calibration techniques in the country.
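The acceptance criterion used above can be expressed as a simple check. A minimal sketch, assuming the CF is defined as the conventionally true value divided by the instrument indication and that the ±20 % limit means a CF between 0.8 and 1.2 (the function names and readings are illustrative, not taken from the SSDL procedure):

```python
def calibration_factor(true_value, indicated_value):
    """CF = conventionally true dose (rate) / instrument indication."""
    return true_value / indicated_value

def within_acceptance(cf, tolerance=0.20):
    """True if the CF lies within +/- tolerance of the ideal value 1.0."""
    return abs(cf - 1.0) <= tolerance

# Illustrative (true, indicated) reading pairs for three survey meters
readings = [(1.00, 0.95), (1.00, 1.90), (1.00, 1.02)]
cfs = [calibration_factor(t, i) for t, i in readings]
passed = sum(within_acceptance(cf) for cf in cfs)  # 2 of the 3 pass
```

A meter with a CF near 0.53 (second pair), like the <0.55 instruments reported above, fails the ±20 % criterion.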
Radiation Protection Dosimetry | 2012
F. Hasford; J. K. Amoako; E. O. Darko; G. Emi-Reynolds; E. K. Sosu; F. Otoo; G. O. Asiedu
The dose management system (DMS) is computer software developed by the International Atomic Energy Agency for managing data on occupational exposure to radiation sources and intake of radionuclides. It is an integrated system for the user-friendly storage, processing and control of all existing internal and external dosimetry data. The Radiation Protection Board (RPB) of the Ghana Atomic Energy Commission has installed, customised and tested the DMS and is using it to improve personnel and area monitoring in the country. Personnel dose records from the RPB's database from 2000 to 2009 are grouped into medical, industrial and education/research sectors. The medical sector dominated the list of monitored institutions in the country over the 10-y period, representing ∼87 %, while the industrial and education/research sectors represented ∼9 and ∼4 %, respectively. The number of monitored personnel in the same period follows a similar trend, with the medical, industrial and education/research sectors representing ∼74, ∼17 and ∼9 %, respectively. Analysis of dose data for 2009 showed that there was no instance of a dose above the annual dose limit of 20 mSv; however, 2.7 % of the exposed workers received individual annual doses >1 mSv. The highest recorded individual annual dose and the total collective dose over all sectors were 4.73 mSv and 159.84 man mSv, respectively. Workers in the medical sector received higher individual doses than those in the other two sectors, and the average dose per exposed worker in all sectors was 0.25 mSv.
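The summary statistics reported above (collective dose, average dose per exposed worker, fraction above 1 mSv) follow directly from the individual annual dose records held in such a database. A minimal sketch; the record values below are invented for illustration and are not RPB data:

```python
# Illustrative annual dose records (mSv) for a handful of monitored workers.
annual_doses_msv = [0.10, 0.25, 1.40, 0.05, 4.73, 0.12, 0.30, 2.10]

collective_dose = sum(annual_doses_msv)                  # man mSv
average_dose = collective_dose / len(annual_doses_msv)   # mSv per exposed worker
over_limit = [d for d in annual_doses_msv if d > 20.0]   # annual dose limit check
over_1msv_pct = 100 * sum(d > 1.0 for d in annual_doses_msv) / len(annual_doses_msv)
```

For these invented records, no worker exceeds the 20 mSv annual limit, mirroring the 2009 finding, while three of the eight exceed 1 mSv.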
Radiation Protection Dosimetry | 2012
F. Hasford; J. Owusu-Banahene; J. K. Amoako; F. Otoo; E. O. Darko; G. Emi-Reynolds; J. Yeboah; C. C. Arwui; Simon Adu
Occupational exposure to radiation in medical practice in Ghana has been analysed for the 10-y period between 2000 and 2009. Monitored dose data for medical institutions in Ghana were extracted from the Radiation Protection Institute's database and analysed in terms of three categories: diagnostic radiology, radiotherapy and nuclear medicine. One hundred and eighty medical facilities were monitored over the 10-y period, of which ~98 % were diagnostic radiology facilities. Only one nuclear medicine and two radiotherapy facilities have been operational in the country since 2000. During the 10-y study period, the number of monitored medical facilities increased by 18.8 %, while the number of exposed workers decreased by 23.0 %. The average number of exposed workers per medical institution over the study period was 4.3. The annual collective dose received by all exposed workers fell by a factor of 4 between 2000 and 2009, reflected in reductions in the annual collective doses in diagnostic radiology, radiotherapy and nuclear medicine facilities of ~76, ~72 and ~55 %, respectively. The highest annual collective dose, 601.2 man mSv, was recorded in 2002, and the lowest, 142.6 man mSv, in 2009. Annual average values of dose per institution and dose per exposed worker decreased by 79 and 67.6 %, respectively, between 2000 and 2009. The average dose per exposed worker for the 10-y period was lowest in radiotherapy and highest in diagnostic radiology, with values of 0.14 and 1.05 mSv, respectively; nuclear medicine recorded an average dose per worker of 0.72 mSv. Correspondingly, the ranges of average effective doses within the diagnostic radiology, radiotherapy and nuclear medicine facilities were 0.328-2.614, 0.383-0.728 and 0.448-0.695 mSv, respectively. Over the whole study period, an average dose per medical institution of 3 mSv and an average dose per exposed worker of 0.69 mSv were recorded.
Most of the individual annual doses >1 mSv were received by workers in diagnostic radiology; there were 705 instances over the study period in which exposed workers received individual annual doses >1 mSv. Regarding thermoluminescent dosemeter (TLD) return rates, facilities in the Volta and Eastern Regions recorded the highest rates, 94.3 % each, while the Ashanti Region recorded the lowest, 76.7 %.
Journal of Medical Physics | 2012
Prince Kwabena Gyekye; G. Emi-Reynolds; Mary Boadu; E. O. Darko; Johnson Yeboah; Stephen Inkoom; Cynthia Kaikor Mensah
Cancer incidence estimates and dosimetry for 120 patients undergoing hysterosalpingography (HSG), without screening at five rural hospitals and with screening using an image intensifier-TV system at an urban hospital, have been studied. Free-in-air kerma measurements were taken for patient dosimetry. Using PCXMC version 1.5, organ and effective doses to patients were estimated. The incidences of cancer of the ovary, colon, bladder and uterus due to radiation exposure were estimated using the Biological Effects of Ionising Radiation (BEIR) VII committee's excess relative risk models. The effective dose to patients was estimated to be 0.20 ± 0.03 mSv and 0.06 ± 0.01 mSv for procedures with and without screening, respectively. An average of 2.5 exposures per procedure and an average screening time of 48.1 s were recorded. Screening time contributed the majority of the patient dose from HSG and should therefore be optimised as much as possible. Of all the cancers considered, cancer of the bladder is the most probable for patients undergoing HSG procedures.
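Organ doses such as those produced by PCXMC are conventionally combined into an effective dose as a weighted sum over tissues. A minimal sketch using a subset of the ICRP Publication 103 tissue weighting factors relevant to pelvic imaging; the organ-dose values are invented for illustration, and summing over only a subset of tissues understates the true effective dose:

```python
# Subset of ICRP 103 tissue weighting factors (the full table covers all tissues
# and its weights sum to 1.0; only a pelvic-imaging subset is shown here).
W_T = {"gonads": 0.08, "colon": 0.12, "bladder": 0.04}

def effective_dose(organ_doses_msv, weights=W_T):
    """Partial effective dose E = sum over tissues T of w_T * H_T (mSv)."""
    return sum(weights[t] * h for t, h in organ_doses_msv.items())

# Invented equivalent doses (mSv), for illustration only
h = {"gonads": 0.5, "colon": 0.4, "bladder": 0.6}
e = effective_dose(h)  # 0.08*0.5 + 0.12*0.4 + 0.04*0.6 = 0.112 mSv
```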
Archive | 2011
Stephen Inkoom; Cyril Schandorf; G. Emi-Reynolds; J. J. Fletcher
The World Health Organization (WHO) defines a quality assurance (QA) programme in diagnostic radiology as an organized effort by the staff operating a facility to ensure that the diagnostic images produced are of sufficiently high quality so that they consistently provide adequate diagnostic information at the lowest possible cost and with the least possible exposure of the patient to radiation (World Health Organization [WHO], 1982). The nature and extent of this programme will vary with the size and type of the facility, the type of examinations conducted, and other factors. The determination of what constitutes high quality in any QA programme will be made by the diagnostic radiology facility producing the images. The QA programme must cover the entire X-ray system, from machine, to processor, to view box. Quality assurance actions include both quality control (QC) techniques and quality administration procedures. QC is normally part of the QA programme, and QC techniques are those used in the monitoring (or testing) and maintenance of the technical elements or components of an X-ray system. QC techniques are thus concerned directly with the equipment that can affect the quality of the image, i.e. the part of the QA programme that deals with instrumentation and equipment. An X-ray system refers to an assemblage of components for the controlled production of diagnostic images with X-rays. At a minimum, it includes an X-ray high-voltage generator, an X-ray control device, a tube-housing assembly, a beam-limiting device and the necessary supporting structures. Other components that function with the system, such as image receptors, image processors, automatic exposure control devices, view boxes and darkrooms, are also parts of the system.
The main goal of a QC programme is to ensure the accuracy of the diagnosis or the intervention (optimising the outcome) while minimising the radiation dose needed to achieve that objective. In a typical diagnostic radiology facility, QC procedures may include the following: a. Acceptance testing and commissioning. An acceptance test is performed on new equipment to demonstrate that it is performing within the manufacturer's specifications and criteria (and also to confirm that the equipment meets
Health Physics | 2011
Mary Boadu; Cyril Schandorf; G. Emi-Reynolds; A. Faanu; Stephen Inkoom; Prince Kwabena Gyekye; Cynthia Kaikor Mensah
The International Basic Safety Standards require that all personnel on whom protection and safety depend be trained and qualified. The Radiation Protection Institute of the Ghana Atomic Energy Commission has adopted a systematic approach to training those occupationally exposed to ionizing radiation in the course of their work. In collaboration with the International Atomic Energy Agency, several training courses have been implemented at the national level and in the African region. From 1993 to 2008, more than 400 occupationally exposed workers in Ghana were trained in radiation safety. Several African regional training events on radiation safety have also been executed, with a total of 583 participants. The training events have contributed towards upgrading the safety culture within the institutions that have participated.
Journal of Radioanalytical and Nuclear Chemistry | 2015
O. K. Adukpo; A. Faanu; H. Lawluvi; L. Tettey-Larbi; G. Emi-Reynolds; E. O. Darko; C. Kansaana; D. O. Kpeglo; A. R. Awudu; E. T. Glover; P. A. Amoah; A. O. Efa; L. A. Agyemang; B. Agyeman; R. Kpordzro; A. I. Doe
1. The commenter thought the minimum detectable activity (MDA) value for K was too low and the counting time too short based on his experience, and asked why distilled water was used. First, the distilled water was used to limit the level of backscatter radiation within the detector system during the background count. We also note that a low-background spectrometry system was used for the counting. To make the background count comparable with the sample counts, the same counting time was used for both. We agree that a longer counting time gives a greater peak area; however, the prevailing laboratory conditions did not allow a longer counting time, so 36,000 s was used, which still gave an appreciable count rate. The MDA is defined as the smallest quantity of radioactivity that can be measured under specified conditions, and it is an important concept in environmental-level measurement. The MDA depends on the lower limit of detection (LLD) and the counting efficiency of the counting system. It is particularly important in environmental-level systems, where the count rate of a sample is almost the same as that of the background. The MDA was calculated using the conventional method at the 95 % confidence level, and the value obtained for K was based on experimental data and calculation using that method.
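The "conventional method" for MDA at the 95 % confidence level usually refers to the Currie formulation. A minimal sketch, assuming the common form MDA = (2.71 + 4.65√B) / (ε · Pγ · t · m); the 36,000 s live time matches the reply, but all other numerical inputs below are invented for illustration, not values from the study:

```python
import math

def mda_bq_per_kg(background_counts, efficiency, gamma_yield, live_time_s, mass_kg):
    """Currie minimum detectable activity at ~95 % confidence, in Bq/kg.

    background_counts: counts in the peak region of the background spectrum
    efficiency:        full-energy peak detection efficiency (fraction)
    gamma_yield:       gamma emission probability per decay (fraction)
    live_time_s:       counting live time in seconds
    mass_kg:           sample mass in kg
    """
    lld_counts = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit in counts
    return lld_counts / (efficiency * gamma_yield * live_time_s * mass_kg)

# Illustrative inputs: 36,000 s count as in the reply; detector values invented
mda = mda_bq_per_kg(background_counts=400, efficiency=0.02,
                    gamma_yield=0.107, live_time_s=36_000, mass_kg=1.0)
```

Because the LLD term grows only with √B while the denominator grows linearly with t, a longer count lowers the MDA, which is consistent with the commenter's point and with the reply's trade-off against laboratory constraints.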
Radiation Protection Dosimetry | 2016
Prince Kwabena Gyekye; Frank Becker; S. Y. Mensah; G. Emi-Reynolds
Studies have shown that there is high radiation exposure of medical staff during computed tomography fluoroscopy (CTF)-guided procedures. This study investigates staff dose reduction techniques that consider the positioning of the CTF gantry in the room and the room dimensions, in addition to the conventional use of thyroid collars, aprons and eye goggles. A Toshiba Aquilion One 640-slice CT scanner and a CTF room were modelled using SimpleGeo. Standing and supine adult mesh phantoms were used to represent the staff and patient. The models were spatially combined on one platform using VOXEL2MCNP, and MCNPX input files were generated from them for the studies. The CTF gantry position, staff position and CTF room size were varied across different scenarios, and the effective, eye-lens and thyroid doses to staff were estimated for each scenario. Additional means of possible dose reduction with respect to the positioning of the CTF device and the room layout are discussed.