For the 40 years that I have been involved in radiation biology, I have been told that the discipline, indeed almost the entire spectrum of the radiation sciences, is disappearing and that radiation researchers themselves are a “dying breed”. On a personal level, the evolution over that time has been a transition from “they say” to “I say”, as the ever-worsening crisis has become evident from both formal (1–3) and informal studies.2 Indeed, positioning the Radiation Research Society to better face this situation was a key platform during my time as President. I am obviously not alone in this concern, with many making stalwart and repeated efforts to identify the underlying causes in the hope of mitigating the decline. In general, their conclusions have included both radiation-specific factors [e.g., the loss of specialized training programs (4, 5); significant reductions in career opportunities in radiation biology and related disciplines (5); competition from, and/or a lack of appreciation by, other sciences and scientists (4, 5), etc.], as well as influences that currently apply across a plethora of disciplines [e.g., the overall reduction in funding opportunities (6, 7); a reduction in public respect for scientists and science, etc.]. However, little has seemed to stem the tide so, at this time, I beg your collective indulgence to offer a reflection on this topic, with a focus on what I believe is a previously unacknowledged issue. My observation comes about as a result of participation on multiple review panels, in combination with my specific experience over the past two years when, through three iterations, I have been (unsuccessfully) submitting a P01 proposal.
Despite the limited number of criteria needed for a successful grant application (8), the overall low level of success suffered by so many of us led me to contemplate the role that our members may play on peer review panels, with the realization that we, as radiation researchers, may be living in a “crab bucket”.

So, what is a crab bucket? I first became aware of this concept while reading one of my favorite authors, Terry Pratchett (9), although a friend remarked recently that the idea may have originated in India with the penchant of the British Raj for seafood. Whatever its origins, the essence of the crab bucket metaphor is this: you can carry a large number of crabs in an open bucket without fear of any escaping, because as fast as any one crab attempts to climb out, the others drag it back down; the group as a whole prevents the progress of any single individual. Although certainly not a phenomenon peculiar to radiation biology, personal observation indicates that our discipline is particularly prone to this mentality when faced with scoring a peer's work. By way of comparison, while participating in reviews of other small scientific niches, I have seen “blind” support of a peer's research: scores of 1s across the board, irrespective of the quality of the research, with the subject expert taking on the role of champion and brooking no dissent from the rest of the panel. Such support may be considered psychologically as a “rising tide” mentality, made in anticipation that if a peer in the same subspecialty is successful, more money will follow. Alternatively, of course, this might reflect a subconscious hope for a quid pro quo (10) or offer evidence of cronyism (11) since, in many of the smaller research areas, it is relatively easy to identify one's peer on a review panel. In contrast, in more recent radiation biology reviews, I have seen the reverse occur, with the “expert” initiating or supporting a veritable bloodbath. If this perception is true, the question arises: why might some feel the need to treat a peer's work so harshly?

One explanation is that, as the number of participants in our field has declined, there has been a parallel fall in the number of available experts to perform reviews. Although this could have resulted in the cronyism described above, at a time when available NIH funding is also on the decline (12, 13), the combination has likely produced an excessive feeling of competition (14), so that the grant evaluation process becomes an extension of the perceived hierarchical relationship between the assigned reviewer and the applicant (14, 15). Interestingly, while I was discussing this topic with others, a source at the NIH3 commented that, when designated the “subject matter expert”, radiation biologists seem inclined to demonstrate their expertise to the rest of the panel by providing excessive criticism rather than championing the application. A number of studies have shown that such actions have a detrimental effect on funding potential, especially when a review panel includes a high proportion of non-experts or where there is disagreement regarding the quality of the submission (16); on a personal note, being older (16) and/or female (17) further, and significantly, increases the applicant's funding disadvantage.

Of course, some readers may dismiss these observations as sour grapes, since they provide an excuse for my failure by blaming others. However, I firmly believe that there is something rotten in the state of radiation biology4 and, at a time when the workforce of radiation research professionals persists in its decline and funding resources continue to shrink, the often onerous role of reviewer becomes of paramount importance. Therefore, I would ask all who accept this role to consider the possible existence of subtle, subconscious biases that may affect your view of your fellow scientists' work. As a counterbalance to these potential biases, I offer some criteria that might be added to your personal review checklist:

  • When acting as the subject expert, focus on feasibility and on the appropriateness of the methodologies and techniques; importantly, set aside personal disagreements over proposed hypotheses, the use of alternative models (unless obviously inappropriate), etc. This is especially important if you are a senior researcher. Try to take a broader view; indeed, challenges to radiation dogma should be supported if soundly argued – how else can we move our field forward?

  • If you are relatively junior when asked to be a reviewer, assess your assignments as you would wish yours to be assessed. There is no need to demonstrate your expertise – to be on the panel, you must already be considered an expert.

  • Although I am cognizant of the amount of work involved, do not simply focus on weaknesses; make every effort to provide feedback on how to improve the grant. Importantly, strongly encourage the inclusion of a qualified radiation worker on any investigative team; not only will the grant benefit, but the importance of a knowledgeable radiation scientist will be emphasized, which may, in one small way, help ensure the continuation of our profession.

I am certainly not advocating that an application's merit be artificially inflated, but that adequate weight be given to the positive aspects of a proposed investigation. Above all, be kind to your fellow radiation researchers; after all, it only takes one crab to keep an entire bucketful from escaping.

ACKNOWLEDGEMENTS

I would like to thank those of you who took the time to read this commentary and provide me with feedback. In deference to your collective positions in the field and the Radiation Research Society, I will allow you all to remain anonymous.

REFERENCES

1. Wallner PE, Anscher MS, Barker CA, Bassetti M, Bristow RG, Cha YI, et al. Current status and recommendations for the future of research, teaching, and testing in the biological sciences of radiation oncology: report of the American Society for Radiation Oncology Cancer Biology/Radiation Biology Task Force, executive summary. Int J Radiat Oncol Biol Phys 2014; 88(1):11–7.
2. Steinberg M, McBride WH, Vlashi E, Pajonk F. National Institutes of Health funding in radiation oncology: a snapshot. Int J Radiat Oncol Biol Phys 2013; 86(2):234–40.
3. Rosenstein BS, Held KD, Rockwell S, Williams JP, Zeman EM. American Society for Radiation Oncology (ASTRO) survey of radiation biology educators in U.S. and Canadian radiation oncology residency programs. Int J Radiat Oncol Biol Phys 2009; 75(3):896–905.
4. Coleman CN, Stone HB, Alexander GA, Barcellos-Hoff MH, Bedford JS, Bristow RG, et al. Education and training for radiation scientists: radiation research program and American Society of Therapeutic Radiology and Oncology Workshop, Bethesda, Maryland, May 12-14, 2003. Radiat Res 2003; 160(6):729–37.
5. Dynlacht JR, Zeman EM, Held KD, Deye J, Vikram B, Joiner MC. Education and training needs in the radiation sciences: Problems and potential solutions. Radiat Res 2015; 184(5):449–55.
6. Chetlen AL, Degnan AJ, Guelfguat M, Griffith B, Itri J, Matta H, et al. Radiology research funding: Current state and future opportunities. Acad Radiol 2018; 25(1):26–39.
7. Squitieri L, Chung KC. Funding research in the twenty-first century: current opinions and future directions. Hand Clin 2014; 30(3):367–76.
8. Falk-Krzesinski HJ, Tobin SC. How do I review thee? Let me count the ways: A comparison of research grant proposal review criteria across US federal funding agencies. J Res Adm 2015; 46(2):79–94.
9. Pratchett T. Unseen academicals. New York, NY: Harper Collins; 2009.
10. Gächter S, Herrmann B. Reciprocity, culture and human cooperation: previous insights and a new cross-cultural experiment. Philos Trans R Soc Lond B Biol Sci 2009; 364(1518):791–806.
11. Guthrie S, Ghiga I, Wooding S. What do we know about grant peer review in the health sciences? F1000Res 2017; 6:1335.
12. Oh YS, Robinson V, Stanley DV, Tolunay E, Kim DY, Galis ZS. “The good old R01”: Challenging downward funding success trends at the National Heart, Lung, and Blood Institute. Circ Res 2016; 118(10):1475–9.
13. Alberts B, Hyman T, Pickett CL, Tilghman S, Varmus H. Improving support for young biomedical scientists. Science 2018; 360(6390):716–8.
14. Bianchi F, Grimaldo F, Bravo G, Squazzoni F. The peer review game: an agent-based model of scientists facing resource constraints and institutional pressures. Scientometrics 2018; 116(3):1401–20.
15. Pier EL, Brauer M, Filut A, Kaatz A, Raclaw J, Nathan MJ, et al. Low agreement among reviewers evaluating the same NIH grant applications. Proc Natl Acad Sci U S A 2018; 115(12):2952–7.
16. Tamblyn R, Girard N, Qian CJ, Hanley J. Assessment of potential bias in research grant peer review in Canada. CMAJ 2018; 190(16):E489–99.
17. Witteman HO, Hendricks M, Straus S, Tannenbaum C. Are gender gaps due to evaluations of the applicant or the science? A natural experiment at a national funding agency. Lancet 2019; 393(10171):531–40.

2 The author is Co-Chair of the National Council on Radiation Protection and Measurements (NCRP) Council Committee 2 on Meeting the Needs of the Nation for Radiation Protection.

3 In response to current administration practice, my source will remain unidentified so as to protect against any institutional retribution.

4 Apologies to William Shakespeare.