In this article, we describe recent methodological enhancements and findings from the dose reconstruction component of a study of health risks among U.S. radiologic technologists. An earlier version of the dosimetry, published in 2006, used physical and statistical models, literature-reported exposure measurements for the years before 1960, and archival personnel monitoring badge data for cohort members through 1984. Those data and models were used to estimate annual occupational radiation doses for 90,000 radiologic technologists, incorporating information on each individual's employment practices from a baseline survey conducted in the mid-1980s. The dosimetry methods presented here, while retaining many of the earlier approaches, now estimate 2.23 million annual badge doses (personal dose equivalent) for the years 1916–1997 for 110,374 technologists and incorporate numerous methodological improvements. Each technologist's annual dose is estimated as a probability density function to reflect uncertainty about the true dose, and multiple realizations of the entire cohort dose distribution were derived to account for shared uncertainties and possible biases in the input data and assumptions used.
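The two-level uncertainty scheme described above can be illustrated with a minimal Monte Carlo sketch. All numbers here are hypothetical placeholders, not values from the study: a lognormal annual dose for one technologist-year (geometric mean 1.2 mGy, geometric standard deviation 2.0), combined with a shared multiplicative bias factor drawn once per cohort realization.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical lognormal parameters for one technologist-year:
# geometric mean (mGy) and geometric standard deviation.
gm, gsd = 1.2, 2.0
mu, sigma = np.log(gm), np.log(gsd)

n_realizations = 1000

# Shared (systematic) uncertainty: one bias factor per realization,
# applied cohort-wide; here assumed lognormal with GSD 1.1.
shared_bias = rng.lognormal(mean=0.0, sigma=np.log(1.1), size=n_realizations)

# Unshared (individual) uncertainty: an independent draw from the
# technologist's annual dose density in each realization.
individual = rng.lognormal(mean=mu, sigma=sigma, size=n_realizations)

# One simulated annual dose (mGy) per cohort realization.
annual_dose = shared_bias * individual
```

Because the bias factor is drawn once per realization rather than once per person, doses across the cohort are correlated within a realization, which is what distinguishes shared from individual uncertainty.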
Major improvements over the earlier dosimetry include:
- a substantial increase in the number of cohort-member annual badge dose measurements;
- additional information on individual apron usage, obtained from surveys conducted in the mid-1990s and mid-2000s;
- refined modeling of lognormal annual badge dose probability density functions using censored-data regression models;
- adjustment of cohort-based annual badge dose probability density functions to reflect individual work patterns and practices reported on questionnaires, and more accurate treatment of minimum detection limits; and
- extensive refinement of organ dose conversion coefficients to account for uncertainties in radiographic machine settings for the techniques employed.

For organ dose estimation, we rely on well-researched assumptions about critical exposure-related variables and their changes over the decades, including the peak kilovoltage and filtration typically used in radiographic examinations and the usual body location for wearing radiation monitoring badges, the latter based on both the literature and national recommendations. We derived organ dose conversion coefficients based on air-kerma weighting of photon fluences from published X-ray spectra, together with energy-dependent transmission factors for protective lead aprons of different thicknesses. Findings are presented as estimated doses for 12 organs and tissues: red bone marrow, female breast, thyroid, brain, lung, heart, colon, ovary, testes, skin of the trunk, skin of the head, neck, and arms, and lens of the eye.
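The air-kerma weighting of conversion coefficients can be sketched as follows. All arrays below are illustrative stand-ins: a coarse relative fluence spectrum, mono-energetic air kerma per unit fluence, organ dose per unit air kerma, and energy-dependent lead apron transmission; in the study these would come from published X-ray spectra and attenuation data.

```python
import numpy as np

# Hypothetical energy grid (keV) and illustrative spectral quantities.
energy_keV = np.array([30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
fluence = np.array([0.05, 0.20, 0.30, 0.25, 0.15, 0.05])          # relative photon fluence
kerma_per_fluence = np.array([1.2, 0.9, 0.7, 0.6, 0.55, 0.5])     # air kerma per unit fluence (arb.)
organ_dose_per_kerma = np.array([0.1, 0.3, 0.5, 0.7, 0.85, 1.0])  # organ dose per unit air kerma
apron_transmission = np.array([0.01, 0.03, 0.08, 0.15, 0.25, 0.35])  # lead apron transmission

# Air-kerma weights: each energy bin contributes in proportion to the
# air kerma it delivers, fluence(E) * kerma_per_fluence(E).
kerma_weights = fluence * kerma_per_fluence
kerma_weights /= kerma_weights.sum()

# Spectrum-averaged conversion coefficients, without and with an apron.
cc_unshielded = float(np.sum(kerma_weights * organ_dose_per_kerma))
cc_apron = float(np.sum(kerma_weights * organ_dose_per_kerma * apron_transmission))
```

Since low-energy photons are attenuated far more strongly by lead, the apron-shielded coefficient is both smaller and weighted toward the high-energy end of the spectrum.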
Radiation Organ Doses Received in a Nationwide Cohort of U.S. Radiologic Technologists: Methods and Findings
Steven L. Simon, Dale L. Preston, Martha S. Linet, Jeremy S. Miller, Alice J. Sigurdson, Bruce H. Alexander, Deukwoo Kwon, R. Craig Yoder, Parveen Bhatti, Mark P. Little, Preetha Rajaraman, Dunstana Melo, Vladimir Drozdovitch, Robert M. Weinstock, Michele M. Doody; Radiation Organ Doses Received in a Nationwide Cohort of U.S. Radiologic Technologists: Methods and Findings. Radiat Res 1 November 2014; 182 (5): 507–528. doi: https://doi.org/10.1667/RR13542.1