1–20 of 45
John D. Boice
Journal Articles
Journal:
Radiation Research
Radiation Research (2019) 191 (4): 297–310.
Published: 21 February 2019
Abstract
Retrospective radiation dose estimations, whether based on physical or biological measurements, or on theoretical dose reconstruction, are limited in their precision and reliability, particularly for exposures that occurred many decades ago. Here, we studied living U.S. military test participants, believed to have received high-dose radiation exposures during nuclear testing-related activities approximately six decades ago, with two primary goals in mind. The first was to compare three different approaches of assessing past radiation exposures: 1. Historical personnel monitoring data alone; 2. Dose reconstruction based on varying levels of completeness of individual information, which can include film badge data; and 3. Retrospective biodosimetry using chromosome aberrations in peripheral blood lymphocytes. The second goal was to use the collected data to make the best possible estimates of bone marrow dose received by a group with the highest military recorded radiation doses of any currently living military test participants. Six nuclear test participants studied had been on Rongerik Atoll during the 1954 CASTLE Bravo nuclear test. Another six were present at the Nevada Test Site (NTS) and/or Pacific Proving Ground (PPG) and were believed to have received relatively high-dose exposures at those locations. All were interviewed, and all provided a blood sample for cytogenetic analysis. Military dose records for each test participant, as recorded in the Defense Threat Reduction Agency's Nuclear Test Review and Information System, were used as the basis for historical film badge records and provided exposure scenario information to estimate dose via dose reconstruction. Dose to bone marrow was also estimated utilizing directional genomic hybridization (dGH) for high-resolution detection of radiation-induced chromosomal translocations and inversions, the latter being demonstrated for the first time for the purpose of retrospective biodosimetry. 
As the true dose for each test participant is not known these many decades after exposure, this study gauged the congruence of different methods by assessing the degree of correlation and degree of systematic differences. Overall, the best agreement between methods, defined by statistically significant correlations and small systematic differences, was between doses estimated by a dose reconstruction methodology that exploited all the available individual detail and the biodosimetry methodology derived from a weighted average dose determined from chromosomal translocation and inversion rates. Employing such a strategy, we found that the Rongerik veterans who participated in this study appear to have received, on average, bone marrow equivalent doses on the order of 300–400 mSv, while the NTS/PPG participants appear to have received approximately 250–300 mSv. The results show that even for nuclear events that occurred six decades in the past, biological signatures of exposure are still present, and when taken together, chromosomal translocations and inversions can serve as reliable retrospective biodosimeters, particularly on a group-average basis, when doses received are greater than statistically determined detection limits for the biological assays used.
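The "weighted average dose" combining the translocation- and inversion-based estimates can be sketched as inverse-variance weighting, a standard way to pool two estimates of the same quantity. This is a minimal illustration only: the paper does not state its weighting scheme, and the doses and standard errors below are hypothetical placeholders, not data from the study.

```python
# Sketch of inverse-variance weighting of two dose estimates (assumed
# scheme; the study's actual weights are not given in the abstract).
def weighted_dose(estimates):
    """estimates: list of (dose_gy, standard_error) pairs.
    Returns the inverse-variance weighted mean and its standard error."""
    weights = [1.0 / se**2 for _, se in estimates]
    total = sum(weights)
    mean = sum(w * d for (d, _), w in zip(estimates, weights)) / total
    se = (1.0 / total) ** 0.5  # SE of the weighted mean
    return mean, se

# Illustrative inputs only (not measured values):
dose, se = weighted_dose([(0.35, 0.08),   # translocation-based estimate
                          (0.40, 0.12)])  # inversion-based estimate
```

The more precise endpoint (smaller standard error) dominates the pooled estimate, which is why the combined dose lands closer to 0.35 Gy than to 0.40 Gy here.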
Journal Articles
Journal:
Radiation Research
Radiation Research (2014) 181 (2): 208–228.
Published: 14 February 2014
Abstract
Polonium-210 is a naturally occurring radioactive element that decays by emitting an alpha particle. It is in the air we breathe and also a component of tobacco smoke. Polonium-210 is used as an anti-static device in printing presses and gained widespread notoriety in 2006 after the poisoning and subsequent death of a Russian citizen in London. More is known about the lethal effects of polonium-210 at high doses than about late effects from low doses. Cancer mortality was examined among 7,270 workers at the Mound nuclear facility near Dayton, OH, where polonium-210 was used (1944–1972) in combination with beryllium as a source of neutrons for triggering nuclear weapons. Other exposures included external gamma radiation and, to a lesser extent, plutonium-238, tritium and neutrons. Vital status and cause of death were determined through 2009. Standardized mortality ratios (SMRs) were computed for comparisons with the general population. Lifetime occupational doses from all places of employment were sought and incorporated into the analysis. Over 200,000 urine samples were analyzed to estimate radiation doses to body organs from polonium and other internally deposited radionuclides. Cox proportional hazards models were used to evaluate dose-response relationships for specific organs and tissues. Vital status was determined for 98.7% of the workers, of whom 3,681 had died compared with 4,073.9 expected (SMR 0.90; 95% CI 0.88–0.93). The mean dose from external radiation was 26.1 mSv (maximum 939.1 mSv) and the mean lung dose from external and internal radiation combined was 100.1 mSv (maximum 17.5 Sv). Among the 4,977 radiation workers, all cancers taken together (SMR 0.86; 95% CI 0.79–0.93), lung cancer (SMR 0.85; 95% CI 0.74–0.98), and other types of cancer were not significantly elevated.
Cox regression analysis revealed a significant positive dose-response trend for esophageal cancer [relative risk (RR) and 95% confidence interval at 100 mSv of 1.54 (1.15–2.07)] and a negative dose-response trend for liver cancer [RR (95% CI) at 100 mSv of 0.55 (0.23–1.32)]. For lung cancer the RR at 100 mSv was 1.00 (95% CI 0.97–1.04) and for all leukemias other than chronic lymphocytic leukemia (CLL) it was 1.04 (95% CI 0.63–1.71). There was no evidence that heart disease was associated with exposures [RR at 100 mSv of 1.06 (0.95–1.18)]. Assuming a relative biological effectiveness factor of either 10 or 20 for polonium and plutonium alpha particle emissions had little effect on the dose-response analyses. Polonium was the largest contributor to lung dose, and a relative risk of 1.04 for lung cancer at 100 mSv could be excluded with 95% confidence. A dose-related increase in cancer of the esophagus was consistent with a radiation etiology but based on small numbers. A dose-related decrease in liver cancer suggests the presence of other modifying factors of risk and adds caution to interpretations. The absence of a detectable increase in total cancer deaths and lung cancer in particular associated with occupational exposures to polonium (mean lung dose 159.8 mSv), and to plutonium to a lesser extent (mean lung dose 13.7 mSv), is noteworthy but based on small numbers. Larger combined studies of U.S. workers are needed to clarify radiation risks following prolonged exposures and radionuclide intakes.
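The SMR figures quoted in this abstract are the standard ratio of observed to expected deaths; for example, 3,681 observed against 4,073.9 expected gives SMR 0.90. A minimal sketch follows, using Byar's approximation for the 95% confidence limits (an assumption: the paper does not state which interval method it used, and exact Poisson limits can differ slightly in the second decimal place).

```python
# SMR = observed / expected, with Byar's approximation to the exact
# Poisson 95% limits (assumed method; illustration only).
def smr_with_ci(observed, expected, z=1.96):
    smr = observed / expected
    o = observed
    lower = o * (1 - 1/(9*o) - z/(3*o**0.5))**3 / expected
    op = o + 1
    upper = op * (1 - 1/(9*op) + z/(3*op**0.5))**3 / expected
    return smr, lower, upper

# Mound cohort totals taken from the abstract above:
smr, lo, hi = smr_with_ci(3681, 4073.9)  # SMR ~0.90
```

With counts this large, the approximate limits land close to the reported (0.88–0.93) interval, illustrating why an SMR of 0.90 with a CI excluding 1.0 indicates a mortality deficit relative to the general population (the "healthy worker" pattern).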
Journal Articles
Journal:
Radiation Research
Radiation Research (2012) 178 (3): 246–247.
Published: 01 September 2012
Journal Articles
Journal:
Radiation Research
Radiation Research (2012) 178 (4): 266–279.
Published: 02 August 2012
Abstract
The purpose of this study is to quantify cancer mortality in relationship to organ-specific radiation dose among women irradiated for benign gynecologic disorders. Included in this study are 12,955 women treated for benign gynecologic disorders at hospitals in the Northeastern U.S. between 1925 and 1965; 9,770 women treated by radiation and 3,186 women treated by other methods. The average age at treatment was 45.9 years (range, 13–88 years), and the average follow-up period was 30.1 years (maximum, 69.9 years). Radiation doses to organs and active bone marrow were reconstructed by medical physicists using original radiotherapy records. The highest doses were received by the uterine cervix (median, 120 Gy) and uterine corpus (median, 34 Gy), followed by the bladder, rectum and colon (median, 1.7–7.2 Gy), with other abdominal organs receiving median doses ≤1 Gy and organs in the chest and head receiving doses <0.1 Gy. Standardized mortality rate ratios relative to the general U.S. population were calculated. Radiation-related risks were estimated in internal analyses using Poisson regression models. Mortality was significantly elevated among irradiated women for cancers of the uterine corpus, ovary, bladder, rectum, colon and brain, as well as for leukemia (exclusive of chronic lymphocytic leukemia) but not for cancer of the cervix, Hodgkin or non-Hodgkin lymphoma, multiple myeloma, or chronic lymphocytic leukemia. Evidence of a dose-response was seen for cancers of the ovary [excess relative risk (ERR) = 0.31/Gy, P < 0.001], bladder (ERR = 0.21/Gy, P = 0.02) and rectum (ERR = 0.23/Gy, P = 0.05) and suggested for colon (ERR = 0.09/Gy, P = 0.10), but not for cancers of the uterine corpus or brain nor for non-chronic lymphocytic leukemia. Relative risks of mortality due to cancers of the stomach, pancreas, liver and kidney were close to 1.0, with no evidence of dose-response over the range of 0–1.5 Gy. 
Breast cancer was not significantly associated with dose to the breast or ovary. Mortality due to cancers of heavily irradiated organs remained elevated up to 40 years after irradiation. Significantly elevated radiation-related risk was seen for cancers of organs proximal to the radiation source or fields (bladder, rectum and ovary), as well as for non-chronic lymphocytic leukemia. Our results corroborate those from previous studies that suggest that cells of the uterine cervix and lymphopoietic system are relatively resistant to the carcinogenic effects of radiation. Studies of women irradiated for benign gynecologic disorders, together with studies of women treated with higher doses of radiation for uterine cancers, provide quantitative information on cancer risks associated with a broad range of pelvic radiation exposures.
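The dose-response estimates above are expressed as excess relative risk per Gy; under the linear ERR form used in internal analyses like these, the relative risk at dose D is RR(D) = 1 + ERR·D. A small sketch (the model form is standard, but the paper's fitted models may include additional modifiers such as age or time since exposure):

```python
# Linear excess-relative-risk model: RR(D) = 1 + ERR * D.
# ERR values below are the point estimates quoted in the abstract;
# the unmodified linear form is an illustrative simplification.
def relative_risk(dose_gy, err_per_gy):
    return 1.0 + err_per_gy * dose_gy

rr_ovary_1gy = relative_risk(1.0, 0.31)    # ovary, ERR = 0.31/Gy -> ~1.31
rr_bladder_2gy = relative_risk(2.0, 0.21)  # bladder, ERR = 0.21/Gy -> ~1.42
```

Reading the coefficients this way makes the magnitudes concrete: an ovarian ERR of 0.31/Gy corresponds to roughly a 31% excess in mortality rate per gray of dose.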
Journal Articles
Journal:
Radiation Research
Radiation Research (2012) 178 (2): AV43–AV60.
Published: 01 August 2012
Abstract
The thyroid gland of children is especially vulnerable to the carcinogenic action of ionizing radiation. To provide insights into various modifying influences on risk, seven major studies with organ doses to individual subjects were evaluated. Five cohort studies (atomic bomb survivors, children treated for tinea capitis, two studies of children irradiated for enlarged tonsils, and infants irradiated for an enlarged thymus gland) and two case-control studies (patients with cervical cancer and childhood cancer) were studied. The combined studies include almost 120,000 people (approximately 58,000 exposed to a wide range of doses and 61,000 nonexposed subjects), nearly 700 thyroid cancers and 3,000,000 person years of follow-up. For persons exposed to radiation before age 15 years, linearity best described the dose response, even down to 0.10 Gy. At the highest doses (>10 Gy), associated with cancer therapy, there appeared to be a decrease or leveling of risk. For childhood exposures, the pooled excess relative risk per Gy (ERR/Gy) was 7.7 (95% CI = 2.1, 28.7) and the excess absolute risk per 10⁴ PY Gy (EAR/10⁴ PY Gy) was 4.4 (95% CI = 1.9, 10.1). The attributable risk percent (AR%) at 1 Gy was 88%. However, these summary estimates were affected strongly by age at exposure even within this limited age range. The ERR was greater (P = 0.07) for females than males, but the findings from the individual studies were not consistent. The EAR was higher among women, reflecting their higher rate of naturally occurring thyroid cancer. The distribution of ERR over time followed neither a simple multiplicative nor an additive pattern in relation to background occurrence. Only two cases were seen within 5 years of exposure. The ERR began to decline about 30 years after exposure but was still elevated at 40 years. Risk also decreased significantly with increasing age at exposure, with little risk apparent after age 20 years.
Based on limited data, there was a suggestion that spreading dose over time (from a few days to >1 year) may lower risk, possibly due to the opportunity for cellular repair mechanisms to operate. The thyroid gland in children has one of the highest risk coefficients of any organ and is the only tissue with convincing evidence for risk at about 0.10 Gy.
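The 88% attributable risk quoted above follows directly from the pooled ERR: under a linear ERR model, the fraction of cases at dose D attributable to radiation is ERR·D / (1 + ERR·D). A short check of that arithmetic:

```python
# Attributable risk percent under a linear ERR model:
# AR% = ERR*D / (1 + ERR*D) * 100.
def attributable_risk_pct(dose_gy, err_per_gy):
    excess = err_per_gy * dose_gy
    return 100.0 * excess / (1.0 + excess)

# Pooled childhood ERR/Gy of 7.7 from the abstract, evaluated at 1 Gy:
ar_1gy = attributable_risk_pct(1.0, 7.7)  # ~88%, matching the abstract
```

This reproduces the reported figure (7.7 / 8.7 ≈ 88.5%), showing how a high ERR translates into the large majority of thyroid cancers at 1 Gy being radiation-attributable in this age group.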
Journal Articles
Journal:
Radiation Research
Radiation Research (2011) 176 (2): 244–258.
Published: 07 March 2011
Abstract
Updated analyses of mortality data are presented on 46,970 workers employed 1948–1999 at Rocketdyne (Atomics International). Overall, 5,801 workers were involved in radiation activities, including 2,232 who were monitored for intakes of radionuclides, and 41,169 workers were engaged in rocket testing or other non-radiation activities. The worker population is unique in that lifetime occupational doses from all places of employment were sought, updated and incorporated into the analyses. Further, radiation doses from intakes of 14 different radionuclides were calculated for 16 organs or tissues using biokinetic models of the International Commission on Radiation Protection (ICRP). Because only negligible exposures were received by the 247 workers monitored for radiation activities after 1999, the mean dose from external radiation remained essentially the same at 13.5 mSv (maximum 1 Sv) as reported previously, as did the mean lung dose from external and internal radiation combined at 19.0 mSv (maximum 3.6 Sv). An additional 9 years of follow-up, from December 31, 1999 through 2008, increased the person-years of observation for the radiation workers by 21.7% to 196,674 (mean 33.9 years) and the number of cancer deaths by 50% to 684. Analyses included external comparisons with the general population and the computation of standardized mortality ratios (SMRs) and internal comparisons using proportional hazards models and the computation of relative risks (RRs). A low SMR for all causes of death (SMR 0.82; 95% CI 0.78–0.85) continued to indicate that the Rocketdyne radiation workers were healthier than the general population and were less likely to die. The SMRs for all cancers taken together (SMR 0.88; 95% CI 0.81–0.95), lung cancer (SMR 0.87; 95% CI 0.76–1.00) and leukemia other than chronic lymphocytic leukemia (CLL) (SMR 1.04; 95% CI 0.67–1.53) were not significantly elevated. Cox regression analyses revealed no significant dose–response trends for any cancer.
For all cancers excluding leukemia, the RR at 100 mSv was estimated as 0.98 (95% CI 0.82–1.17), and for all leukemia other than CLL it was 1.06 (95% CI 0.50–2.23). Uranium was the primary radionuclide contributing to internal exposures, but no significant increases in lung and kidney disease were seen. The extended follow-up reinforces the findings in the previous study in failing to observe a detectable increase in cancer deaths associated with radiation, but strong conclusions still cannot be drawn because of small numbers and relatively low career doses. Larger combined studies of early workers in the United States using similar methodologies are warranted to refine and clarify radiation risks after protracted exposures.
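The "RR at 100 mSv" figures come from a log-linear Cox model, RR(d) = exp(βd). Under that form, a reported RR at 100 mSv implies the RR at any other dose by simple power scaling. This is an inference from the model form, not a calculation reported in the paper:

```python
# Under a log-linear Cox model, RR(d) = exp(beta * d), so a known
# RR at 100 mSv scales to other doses as a power of itself.
# (Illustrative derivation; not a result quoted in the paper.)
def rr_scaled(rr_100msv, dose_msv):
    return rr_100msv ** (dose_msv / 100.0)

# E.g., the all-cancers-excluding-leukemia point estimate of 0.98 at
# 100 mSv corresponds at 200 mSv to:
rr_200 = rr_scaled(0.98, 200.0)  # 0.98**2 = 0.9604
```

Scaling this way also shows why "relatively low career doses" limit statistical power: at the mean external dose of 13.5 mSv, the implied departure of RR from 1.0 is tiny compared with the width of the confidence intervals.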
Journal Articles
Journal:
Radiation Research
Radiation Research (2010) 174 (5): 624–636.
Published: 13 September 2010
Abstract
In a previous cohort study of workers engaged in uranium milling and mining activities near Grants, Cibola County, New Mexico, we found lung cancer mortality to be significantly increased among underground miners. Uranium mining took place from early in the 1950s to 1990, and the Grants Uranium Mill operated from 1958 to 1990. The present study evaluates cancer mortality during 1950–2004 and cancer incidence during 1982–2004 among county residents. Standardized mortality (SMR) and incidence (SIR) ratios and 95% confidence intervals (CI) were computed, with observed numbers of cancer deaths and cases compared to expected values based on New Mexico cancer rates. The total numbers of cancer deaths and incident cancers were close to those expected (SMR 1.04, 95% CI 1.01–1.07; SIR 0.97, 95% CI 0.92–1.02). Lung cancer mortality and incidence were significantly increased among men (SMR 1.11, 95% CI 1.02–1.21; SIR 1.40, 95% CI 1.18–1.64) but not women (SMR 0.97, 95% CI 0.85–1.10; SIR 1.01, 95% CI 0.78–1.29). Similarly, among the population of the three census tracts near the Grants Uranium Mill, lung cancer mortality was significantly elevated among men (SMR 1.57; 95% CI 1.21–1.99) but not women (SMR 1.12; 95% CI 0.75–1.61). Except for an elevation in mortality for stomach cancer among women (SMR 1.30; 95% CI 1.03–1.63), which declined over the 55-year observation period, no significant increases in SMRs or SIRs for 22 other cancers were found. Although etiological inferences cannot be drawn from these ecological data, the excesses of lung cancer among men seem likely to be due to previously reported risks among underground miners from exposure to radon gas and its decay products. Smoking, socioeconomic factors or ethnicity may also have contributed to the lung cancer excesses observed in our study. The stomach cancer increase was highest before the uranium mill began operation and then decreased to normal levels.
With the exception of male lung cancer, this study provides no clear or consistent evidence that the operation of uranium mills and mines adversely affected cancer incidence or mortality of county residents.
Journal Articles
Journal:
Radiation Research
Radiation Research (2010) 174 (3): 367–376.
Published: 28 June 2010
Abstract
Mammography screening is an accepted procedure for early detection of breast tumors among asymptomatic women. Since this procedure involves the use of X rays, it is itself potentially carcinogenic. Although there is general consensus about the benefit of screening for older women, screening practices differ between countries. In this paper, radiation risks for these different practices are estimated using a new approach. We model breast cancer induction by ionizing radiation in a cohort of patients exposed to frequent X-ray examinations. The biologically based, mechanistic model provides a better foundation for the extrapolation of risks to different mammography screening practices than empirical models do. The model predicts that the excess relative risk (ERR) doubles when screening starts at age 40 instead of 50 and that a continuation of screening at ages 75 and higher carries little extra risk. The number of induced fatal breast cancers is estimated to be considerably lower than derived from epidemiological studies and from internationally accepted radiation protection risks. The present findings, if used in a risk-benefit analysis for mammography screening, would be more favorable to screening than estimates currently recommended for radiation protection. This has implications for the screening ages that are currently being reconsidered in several countries.
Journal Articles
Journal:
Radiation Research
Radiation Research (2007) 167 (6): 711–726.
Published: 01 June 2007
Abstract
Boice, J. D., Jr., Mumma, M. T. and Blot, W. J. Cancer and Noncancer Mortality in Populations Living Near Uranium and Vanadium Mining and Milling Operations in Montrose County, Colorado, 1950–2000. Radiat. Res. 167, 711–726 (2007). Mining and milling of uranium in Montrose County on the Western Slope of Colorado began in the early 1900s and continued until the early 1980s. To evaluate the possible impact of these activities on the health of communities living on the Colorado Plateau, mortality rates between 1950 and 2000 among Montrose County residents were compared to rates among residents in five similar counties in Colorado. Standardized mortality ratios (SMRs) were computed as the ratio of observed numbers of deaths in Montrose County to the expected numbers of deaths based on mortality rates in the general populations of Colorado and the United States. Relative risks (RRs) were computed as the ratio of the SMRs for Montrose County to the SMRs for the five comparison counties. Between 1950 and 2000, a total of 1,877 cancer deaths occurred in the population residing in Montrose County, compared with 1,903 expected based on general population rates for Colorado (SMR(CO) 0.99). There were 11,837 cancer deaths in the five comparison counties during the same 51-year period compared with 12,135 expected (SMR(CO) 0.98). There was no difference between the total cancer mortality rates in Montrose County and those in the comparison counties (RR = 1.01; 95% CI 0.96–1.06). Except for lung cancer among males (RR = 1.19; 95% CI 1.06–1.33), no statistically significant excesses were seen for any causes of death of a priori interest: cancers of the breast, kidney, liver, bone, or childhood cancer, leukemia, non-Hodgkin lymphoma, renal disease or nonmalignant respiratory disease. Lung cancer among females was decreased (RR = 0.83; 95% CI 0.67–1.02).
The absence of elevated mortality rates of cancer in Montrose County over a period of 51 years suggests that the historical milling and mining operations did not adversely affect the health of Montrose County residents. Although descriptive correlation analyses such as this preclude definitive causal inferences, the increased lung cancer mortality seen among males but not females is most likely due to prior occupational exposure to radon and cigarette smoking among underground miners residing in Montrose County, consistent with previous cohort studies of Colorado miners and of residents of the town of Uravan in Montrose County.
Journal Articles
Journal:
Radiation Research
Radiation Research (2006) 166 (1): 98–115.
Published: 01 July 2006
Abstract
Boice, Jr., J. D., Cohen, S. S., Mumma, M. T., Ellis, E. D., Eckerman, K. F., Leggett, R. W., Boecker, B. B., Brill, A. B. and Henderson, B. E. Mortality among Radiation Workers at Rocketdyne (Atomics International), 1948–1999. Radiat. Res. 166, 98–115 (2006). A retrospective cohort mortality study was conducted of workers engaged in nuclear technology development and employed for at least 6 months at Rocketdyne (Atomics International) facilities in California, 1948–1999. Lifetime occupational doses were derived from company records and linkages with national dosimetry data sets. International Commission on Radiation Protection (ICRP) biokinetic models were used to estimate radiation doses to 16 organs or tissues after the intake of radionuclides. Standardized mortality ratios (SMRs) compared the observed numbers of deaths with those expected in the general population of California. Cox proportional hazards models were used to evaluate dose–response trends over categories of cumulative radiation dose, combining external and internal organ-specific doses. There were 5,801 radiation workers, including 2,232 monitored for radionuclide intakes. The mean dose from external radiation was 13.5 mSv (maximum 1 Sv); the mean lung dose from external and internal radiation combined was 19.0 mSv (maximum 3.6 Sv). Vital status was determined for 97.6% of the workers, of whom 25.3% (n = 1,468) had died. The average period of observation was 27.9 years. All cancers taken together (SMR 0.93; 95% CI 0.84–1.02) and all leukemia excluding chronic lymphocytic leukemia (CLL) (SMR 1.21; 95% CI 0.69–1.97) were not significantly elevated. No SMR was significantly increased for any cancer or for any other cause of death. The Cox regression analyses revealed no significant dose–response trends for any cancer. For all cancers excluding leukemia, the RR at 100 mSv was estimated as 1.00 (95% CI 0.81–1.24), and for all leukemia excluding CLL it was 1.34 (95% CI 0.73–2.45).
The nonsignificant increase in leukemia (excluding CLL) was in accord with expectation from other radiation studies, but a similar nonsignificant increase in CLL (a malignancy not found to be associated with radiation) tempers a causal interpretation. Radiation exposure has not caused a detectable increase in cancer deaths in this population, but results are limited by small numbers and relatively low career doses.
Journal Articles
Journal:
Radiation Research
Radiation Research (2003) 160 (6): 691–706.
Published: 01 December 2003
Abstract
Travis, L. B., Hauptmann, M., Gaul, L. K., Storm, H. H., Goldman, M. B., Nyberg, U., Berger, E., Janower, M. L., Hall, P., Monson, R. R., Holm, L-E., Land, C. E., Schottenfeld, D., Boice, J. D., Jr. and Andersson, M. Site-Specific Cancer Incidence and Mortality after Cerebral Angiography with Radioactive Thorotrast. Radiat. Res. 160, 691–706 (2003). Few opportunities exist to evaluate the carcinogenic effects of long-term internal exposure to α-particle-emitting radionuclides. Patients injected with Thorotrast (thorium-232) during radiographic procedures, beginning in the 1930s, provide one such valuable opportunity. We evaluated site-specific cancer incidence and mortality among an international cohort of 3,042 patients injected during cerebral angiography with either Thorotrast (n = 1,650) or a nonradioactive agent (n = 1,392) and who survived 2 or more years. Standardized incidence ratios (SIR) for Thorotrast and comparison patients (Denmark and Sweden) were estimated and relative risks (RR), adjusted for population, age and sex, were generated with multivariate statistical modeling. For U.S. patients, comparable procedures were used to estimate standardized mortality ratios (SMR) and RR, representing the first evaluation of long-term, site-specific cancer mortality in this group. Compared with nonexposed patients, significantly increased risks in Thorotrast patients were observed for all incident cancers combined (RR = 3.4, 95% CI 2.9–4.1, n = 480, Denmark and Sweden) and for cancer mortality (RR = 4.0, 95% CI 2.5–6.7, n = 114, U.S.). Approximately 335 incident cancers were above expectation, with large excesses seen for cancers of the liver, bile ducts and gallbladder (55% or 185 excess cancers) and leukemias other than CLL (8% or 26 excess cancers). The RR of all incident cancers increased with time since angiography (P < 0.001) and was threefold at 40 or more years; significant excesses (SIR = 4.0) persisted for 50 years.
Increasing cumulative dose of radiation was associated with an increasing risk of all incident cancers taken together and with cancers of the liver, gallbladder, and peritoneum and other digestive sites; similar findings were observed for U.S. cancer mortality. A marginally significant dose response was observed for the incidence of pancreas cancer (P = 0.05) but not for lung cancer. Our study confirms the relationship between Thorotrast and increased cancer incidence at sites of Thorotrast deposition and suggests a possible association with pancreas cancer. After injection with >20 ml Thorotrast, the cumulative excess risk of cancer incidence remained elevated for up to 50 years and approached 97%. Caution is needed in interpreting the excess risks observed for site-specific cancers, however, because of the potential bias associated with the selection of cohort participants, noncomparability with respect to the internal or external comparison groups, and confounding by indication. Nonetheless, the substantial risks associated with liver cancer and leukemia indicate that unique and prolonged exposure to α-particle-emitting Thorotrast increased carcinogenic risks.
Journal Articles
Journal:
Radiation Research
Radiation Research (2002) 158 (2): 220–235.
Published: 01 August 2002
Abstract
Preston, D. L., Mattsson, A., Holmberg, E., Shore, R., Hildreth, N. G. and Boice, J. D., Jr. Radiation Effects on Breast Cancer Risk: A Pooled Analysis of Eight Cohorts. Radiat. Res. 158, 220–235 (2002). Breast cancer incidence rates after radiation exposure in eight large cohorts are described and compared. The nature of the exposures varies appreciably, ranging from a single or a small number of high-dose-rate exposures (Japanese atomic bomb survivors, U.S. acute post-partum mastitis patients, Swedish benign breast disease patients, and U.S. infants with thymic enlargement) to highly fractionated high-dose-rate exposures (two U.S. tuberculosis cohorts) and protracted low-dose-rate exposure (two Swedish skin hemangioma cohorts). There were 1,502 breast cancers among 77,527 women (about 35,000 of whom were exposed) with 1.8 million woman-years of follow-up. The excess risk depends linearly on dose with a downturn at high doses. No simple unified summary model adequately describes the excess risks in all groups. Excess risks for the thymus, tuberculosis, and atomic bomb survivor cohorts have similar temporal patterns, depending on attained age for relative risk models and on both attained age and age at exposure for excess rate models. Excess rates were similar in these cohorts, whereas, related in part to the low breast cancer background rates for Japanese women, the excess relative risk per unit dose in the bomb survivors was four times that in the tuberculosis or thymus cohorts. Excess rates were higher for the mastitis and benign breast disease cohorts. The hemangioma cohorts showed lower excess risks suggesting ameliorating dose-rate effects for protracted low-dose-rate exposures. For comparable ages at exposure (∼0.5 years), the excess risk in the hemangioma cohorts was about one-seventh that in the thymus cohort, whose members received acute high-dose-rate exposures. 
The results support the linearity of the radiation dose response for breast cancer, highlight the importance of age and age at exposure on the risks, and suggest a similarity in risks for acute and fractionated high-dose-rate exposures with much smaller effects from low-dose-rate protracted exposures. There is also a suggestion that women with some benign breast conditions may be at elevated risk of radiation-associated breast cancer.
Journal Articles
Journal:
Radiation Research
Radiation Research (2002) 157 (4): 483–489.
Published: 01 April 2002
Abstract
Sigurdson, A. J., Stovall, M., Kleinerman, R. A., Maor, M. H., Taylor, M. E., Boice, J. D., Jr. and Ron, E. Feasibility of Assessing the Carcinogenicity of Neutrons among Neutron Therapy Patients. Radiat. Res. 157, 483–489 (2002). Nuclear workers, oil well loggers, astronauts, air flight crews, and frequent fliers can be exposed to low doses of neutrons, but the long-term human health consequences of neutron exposure are unknown. While few of these exposed populations are suitable for studying the effects of neutron exposure, patients treated with neutron-beam therapy might be a source of information. To assess the feasibility of conducting a multi-center international study of the late effects of neutron therapy, we surveyed 23 cancer centers that had used neutron beam therapy. For the 17 responding institutions, only 25% of the patients treated with neutrons (2,855 of 11,191) were alive more than 2 years after treatment. In a two-center U.S. pilot study of 484 neutron-treated cancer patients, we assessed the feasibility of obtaining radiotherapy records, cancer incidence and other follow-up data, and of estimating patient organ doses. Patients were treated with 42 MeV neutrons between 1972 and 1989. Applying a clinical equivalence factor of 3.2 for neutrons, total average organ doses outside the treatment beam ranged from 0.14 to 0.29 Gy for thyroid, 0.40 to 2.50 Gy for breast, 0.63 to 2.35 Gy for kidney, and 1.12 to 1.76 Gy for active bone marrow depending upon the primary cancer treatment site. We successfully traced 97% of the patients, but we found that patient survival was poor and that chemotherapy was not confirmable in a quarter of the patients. 
Based on our findings from the international survey and the feasibility study, we conclude that a large investigation could detect a fivefold or higher leukemia risk, but would be inadequate to evaluate the risk of solid cancers with long latent periods and therefore would likely not be informative with respect to neutron-related cancer risk in humans.
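The organ-dose estimates above apply a clinical equivalence factor of 3.2 to the absorbed neutron dose. A minimal sketch of that scaling, with an illustrative dose value rather than study data:

```python
# Sketch: converting an out-of-beam absorbed neutron dose (Gy) to a
# photon-equivalent dose using the clinical equivalence factor of 3.2
# cited in the abstract. The example dose is hypothetical.
CLINICAL_EQUIVALENCE_FACTOR = 3.2

def equivalent_dose(absorbed_dose_gy, factor=CLINICAL_EQUIVALENCE_FACTOR):
    """Photon-equivalent dose from an absorbed neutron dose."""
    return absorbed_dose_gy * factor

# Example: a hypothetical 0.05 Gy absorbed neutron dose to the thyroid
print(round(equivalent_dose(0.05), 2))  # 0.16
```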
Radiation Research (2001) 156 (6): 718–723.
Published: 01 December 2001
Abstract
Naumburg, E., Bellocco, R., Cnattingius, S., Hall, P., Boice, J. D. Jr. and Ekbom, A. Intrauterine Exposure to Diagnostic X Rays and Risk of Childhood Leukemia Subtypes. Radiat. Res. 156, 718–723 (2001). The relationship between childhood leukemia and prenatal exposure to low-dose ionizing radiation remains debatable. This population-based case–control study investigated the association between prenatal exposure to diagnostic X-ray examinations (for different types of examinations and at different stages of pregnancy) and the risk of childhood lymphatic and myeloid leukemia. All children born and diagnosed with leukemia between 1973 and 1989 in Sweden (578 lymphatic and 74 myeloid) were selected as cases, and each was matched (by sex and year of birth) to a healthy control child (excluding Down's syndrome). Exposure data were abstracted blindly from all available medical records. Odds ratios (OR) and 95% confidence intervals (CI) were calculated by conditional logistic regression. It was found that prenatal X-ray examinations resulting in direct fetal exposure were not associated with a significant overall increased risk for childhood leukemia (OR = 1.11, 95% CI 0.83–1.47), for lymphatic leukemia (OR = 1.04, 95% CI 0.77–1.40), or for myeloid leukemia (OR = 1.49, 95% CI 0.48–4.72). There was little evidence of a dose response or variation in risk by trimester of exposure or age at diagnosis. Thus X-ray examinations performed during pregnancy in the 1970s and 1980s in Sweden did not affect the risk of childhood leukemia discernibly.
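The study above used conditional logistic regression for its matched design; as a simpler unmatched sketch, an odds ratio and Woolf 95% confidence interval can be computed from a 2×2 table. The counts below are hypothetical, not the Swedish study data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-based) 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls.
    Note: this unmatched estimator is a simplification of the
    conditional logistic regression used in the study itself."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical exposure counts for illustration:
or_, lo, hi = odds_ratio_ci(120, 458, 110, 468)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```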
Radiation Research (2001) 156 (2): 136–150.
Published: 01 August 2001
Abstract
Travis, L. B., Land, C. E., Andersson, M., Nyberg, U., Goldman, M. B., Knudson Gaul, L., Berger, E., Storm, H. H., Hall, P., Auvinen, A., Janower, M. L., Holm, L-E., Monson, R. R., Schottenfeld, D. and Boice, J. D., Jr. Mortality after Cerebral Angiography with or without Radioactive Thorotrast: An International Cohort of 3,143 Two-Year Survivors. Radiat. Res. 156, 136–150 (2001). There are few studies on the long-term sequelae of radionuclides ingested or injected into the human body. Patients exposed to radioactive Thorotrast in the 1930s through the early 1950s provide a singular opportunity, since the administration of this radiographic contrast agent resulted in continuous exposure to α particles throughout life at a low dose rate. We evaluated cause-specific mortality among an international cohort of 3,143 patients injected during cerebral angiography with either Thorotrast (n = 1,736) or a similar but nonradioactive agent (n = 1,407) and who survived 2 or more years. Standardized mortality ratios (SMRs) for Thorotrast and comparison patients were calculated, and relative risks (RR), adjusted for population, age and sex, were obtained by multivariate statistical modeling. Most patients were followed until death, with only 94 (5.4%) of the Thorotrast patients known to be alive at the closure of the study. All-cause mortality (n = 1,599 deaths) was significantly elevated among Thorotrast subjects [RR 1.7; 95% confidence interval (CI) 1.5–1.8]. Significantly increased relative risks were found for several categories, including cancer (RR 2.8), benign and unspecified tumors (RR 1.5), benign blood diseases (RR 7.1), and benign liver disorders (RR 6.5). Nonsignificant increases were seen for respiratory disease (RR 1.4) and other types of digestive disease (RR 1.6). The relative risk due to all causes increased steadily after angiography to reach a threefold RR at 40 or more years (P < 0.001).
Excess cancer deaths were observed for each decade after Thorotrast injection, even after 50 years (SMR 8.6; P < 0.05). Increasing cumulative dose of radiation was directly associated with death due to all causes combined, cancer, respiratory disease, benign liver disease, and other types of digestive disease. Our study confirms the relationship between Thorotrast and increased mortality due to cancer, benign liver disease, and benign hematological disease, and suggests a possible relationship with respiratory disorders and other types of digestive disease. The cumulative excess risk of cancer death remained high up to 50 years after injection with >20 ml Thorotrast and approached 50%.
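A standardized mortality ratio such as the SMR 8.6 quoted above is the ratio of observed to expected deaths. A minimal sketch, using hypothetical observed and expected counts chosen only to reproduce that magnitude, with an approximate log-normal 95% CI:

```python
import math

def smr_with_ci(observed, expected, z=1.96):
    """Standardized mortality ratio (observed / expected deaths) with an
    approximate 95% CI from the log-normal approximation
    SE[log(SMR)] ~= 1 / sqrt(observed).
    The counts passed in below are hypothetical, not study data."""
    smr = observed / expected
    se_log = 1 / math.sqrt(observed)
    return smr, smr * math.exp(-z * se_log), smr * math.exp(z * se_log)

# Hypothetical: 43 observed deaths where 5 were expected
smr, lo, hi = smr_with_ci(43, 5)
print(f"SMR = {smr:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # SMR = 8.6
```

An SMR whose CI excludes 1.0, as here, corresponds to the statistically significant excess (P < 0.05) reported in the abstract.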