After a planned or unplanned radiation exposure, determination of absorbed dose has great clinical importance, informing treatment and triage decisions for exposed individuals. Biodosimetry approaches allow dose to be determined in the absence of physical measurement apparatus. The current state-of-the-art biodosimetry method is based on the frequency of induced dicentric chromosomes in peripheral blood T cells, which increases with the absorbed radiation dose. Since the dose-response curves used to estimate absorbed dose in humans are based on data from in vitro studies, a concerning discrepancy may be present in the reported dose. Specifically, T-cell survival after in vitro irradiation is much higher than that measured in humans in vivo and, in addition, is not dose dependent over some dose ranges. We hypothesized that these differences may lead to inappropriately inflated dicentric frequencies after in vitro irradiation compared with in vivo irradiation of the same samples, which in turn may lead to underestimation of the in vivo dose. To test this hypothesis, we employed the humanized mouse model, which allowed direct comparison of cell depletion and dicentric frequencies in human T cells irradiated in vivo and in vitro. The results showed similar dicentric chromosome induction frequencies in vivo and in vitro when assessed 24 h postirradiation, despite the differences in cell survival. These results appear to validate the use of in vitro data for the estimation of absorbed dose in human radiation biodosimetry.
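The dose estimation step described above can be sketched in code. In standard cytogenetic biodosimetry, the calibration curve relating dicentric yield Y (dicentrics per cell) to dose D is typically linear-quadratic, Y = C + αD + βD², and the absorbed dose is recovered as the positive root of that quadratic. The coefficients below are illustrative placeholders, not values from this study:

```python
import math

# Illustrative linear-quadratic coefficients (NOT from this study):
# C: background dicentric frequency; alpha, beta: fitted curve terms.
C, ALPHA, BETA = 0.001, 0.05, 0.06

def dicentric_yield(dose, c=C, alpha=ALPHA, beta=BETA):
    """Forward model: expected dicentrics per cell at a given dose (Gy)."""
    return c + alpha * dose + beta * dose ** 2

def dose_estimate(yield_obs, c=C, alpha=ALPHA, beta=BETA):
    """Invert the calibration curve: solve beta*D^2 + alpha*D + (c - Y) = 0
    and return the positive root as the dose estimate (Gy)."""
    disc = alpha ** 2 + 4 * beta * (yield_obs - c)
    return (-alpha + math.sqrt(disc)) / (2 * beta)
```

Round-tripping a dose through the forward model and its inverse returns the original value, e.g. `dose_estimate(dicentric_yield(2.0))` gives 2.0 Gy. In practice the coefficients are fitted to scored in vitro calibration data, which is exactly why the in vivo/in vitro comparison in this study matters.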
