Context: Training load and movement quality are associated with injury risk in athletes. Given these associations, it is important to understand how movement quality may moderate training load so that appropriate injury-prevention strategies can be applied.
Objective: To determine how absolute and relative internal training loads change during a men's National Collegiate Athletic Association (NCAA) soccer season and how movement quality, assessed using the Landing Error Scoring System (LESS), moderates the relative internal training load.
Design: Prospective cohort study.
Setting: Division I athletics.
Patients or Other Participants: One NCAA Division I male collegiate soccer team was recruited and followed over 2 consecutive seasons. Fifty-two athlete-seasons (age = 19.71 ± 1.30 years, height = 1.81 ± 0.06 m, mass = 75.74 ± 6.64 kg) were recorded from 32 consenting athletes, and 46 met the criteria for inclusion in the final statistical analysis.
Main Outcome Measure(s): Daily absolute internal training load was tracked over 2 seasons using a rating of perceived exertion scale and session duration, which were used to calculate the absolute and relative internal training loads. Movement quality was assessed using the LESS, and participants were categorized as poor movers (LESS score ≥5) or good movers (LESS score ≤4).
Results: The 46 retained athlete-seasons comprised 29 poor movers and 17 good movers. Absolute (P < .001) and relative (P < .001) internal training loads differed across the weeks of the season. However, movement quality did not moderate the relative internal training load (P = .264).
Conclusions: Absolute and relative training loads changed across the weeks of a male collegiate soccer season. Movement quality did not affect the relative training load, but studies with larger samples are needed to confirm this result.
Key Points
• Absolute and relative internal training loads changed throughout a Division I collegiate male soccer season.
• In male collegiate soccer players, movement quality did not moderate the relative internal training load.
Playing soccer is associated with an inherently high injury risk. Incidence rates range from 2.0 to 19.4 and from 2.48 to 9.4 injuries per 1000 hours of exposure for youth and professional soccer athletes, respectively.1,2 These injuries usually manifest as sprains and strains of the upper leg, ankle, and knee.1 As a result, injury prevention and risk identification are major roles for sports medicine professionals.3–5 Investigators have focused in part on examining modifiable risk factors in athletes in an effort to mitigate the injury risk.6–8
Over the past 15 years, a growing body of literature7–13 has demonstrated that training load is associated with injury risk. An absolute training load is typically defined as the amount of work experienced during a given event.14 For example, an increased risk of noncontact soft tissue injury was observed with high weekly internal training loads (ie, absolute training loads) relative to lower training loads in professional rugby league players.12 Similarly, week-to-week spikes in internal training load were associated with injury risk in Australian rules football players.13 In addition, relative internal training loads (ie, the acute:chronic workload ratio) have been associated with changes in injury risk among professional soccer players.10,15 A relative training load describes an individual's most recent workload relative to the workload over the past several weeks, and the injury risk is increased when the ratio exceeds 1.5.16 This relationship between training load and injury underscores the importance of understanding how absolute and relative training loads change during a season, as such changes may provide insight into high-risk periods that require injury-prevention strategies. Our understanding of training load during a collegiate soccer season continues to grow,17–22 but no investigators to date have reported week-to-week data over multiple seasons.
In addition, it is important to understand factors that may moderate one's internal training load response. Authors23,24 have shown that movement quality is a risk factor for injury in soccer athletes. Poor movement quality can be accurately and reliably assessed via the Landing Error Scoring System (LESS),25,26 and poor performance on the LESS is associated with an increased injury risk in soccer athletes.24 Individuals who have poor movement quality may experience greater mechanical stress during training and competition, which may in turn result in greater perceived exertion and a higher internal training load. However, researchers have not determined whether movement quality moderates the training load response during an athletic season.
Therefore, our objectives in this investigation were to determine how (1) absolute and relative internal training loads changed during a men's National Collegiate Athletic Association (NCAA) Division I soccer season and (2) movement quality, assessed using the LESS, moderated the relative internal training load during a competitive season. Based on the existing literature, we hypothesized that the absolute and relative internal training loads would vary during a competitive men's Division I collegiate soccer season. Furthermore, we hypothesized that athletes who demonstrated poor movement quality (ie, higher LESS scores) would have a higher risk of experiencing a relative internal training load spike during the season.
METHODS
Participants
One NCAA Division I male soccer team was recruited to participate for 2 consecutive competitive seasons. Each season, after reading a description of the study's purpose, members of the soccer team provided written consent using a form approved by the university's institutional review board. A total of 32 individuals consented to participate, 20 of whom participated in both seasons, resulting in 52 athlete-season data points. Participant demographics are presented in Table 1. Primary playing status (ie, starter versus reserve) and playing position (goalkeeper, defender, midfielder, or forward) were recorded for each person in each season (Table 2).
Training Load
We operationally defined internal training load as the work experienced by an athlete during a given training or game session.14 We used a modified Borg Rating of Perceived Exertion (RPE) scale in which 1 = resting and 10 = maximal effort.13–15 All session RPE (sRPE) data were obtained by a team athletic trainer (AT) within 30 minutes12 of a soccer event ending. The AT showed the athlete the modified Borg scale and asked, "How did you feel today's [training session, lift, game] was, from start to finish?"14
Daily internal training load was calculated as the product of the session RPE and the session duration in minutes, yielding a value in arbitrary units (AU). The weekly absolute internal training load represents the sum of the daily training loads over a 7-day period (ie, Monday to Sunday) and was calculated for each athlete for every week of a season. The weekly absolute internal training load data were then used to create a relative training load, often described as an acute:chronic workload ratio. The relative internal training load is the ratio between an athlete's acute internal training load (ie, most recent weekly absolute internal training load) and chronic load (ie, average weekly absolute training load over the past 4 weeks).15,27 Because NCAA rules prohibit athlete contact before the start of the season, no training load data were acquired during the off-season. As a result, we did not calculate relative internal training load values until the 5th week of each season. Injuries were recorded but resulted in an athlete's removal from the study only if he missed ≥6 weeks of participation.
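The following minimal Python sketch illustrates these calculations with hypothetical values; the convention that the chronic window comprises the 4 weeks preceding the acute week (consistent with the first ratio appearing in week 5) is our assumption, as the text does not state it explicitly.

```python
from statistics import mean

def session_load(rpe: float, minutes: float) -> float:
    """Daily internal training load in arbitrary units (AU): sRPE x duration."""
    return rpe * minutes

def weekly_absolute_load(daily_loads: list[float]) -> float:
    """Weekly absolute internal training load: sum of daily loads, Monday-Sunday."""
    return sum(daily_loads)

def relative_load(weekly_loads: list[float]) -> float:
    """Acute:chronic workload ratio for the most recent week.

    Assumes `weekly_loads` holds >=5 weekly absolute loads, oldest first:
    the acute load is the last entry, and the chronic load is the mean of
    the 4 weeks preceding it (our assumed convention; see lead-in).
    """
    acute = weekly_loads[-1]
    chronic = mean(weekly_loads[-5:-1])
    return acute / chronic

# Hypothetical athlete: five weekly absolute loads in AU.
weeks = [4400.0, 3800.0, 3200.0, 4100.0, 4600.0]
print(round(relative_load(weeks), 2))  # 1.19; ratios >= 1.5 flag elevated risk
```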
Landing Error Scoring System
The LESS-17 was used to assess participant movement quality (ie, movement errors) at the beginning of each competitive season, before the first day of practice. Higher LESS scores (ie, more errors) indicate poorer movement quality and are associated with high-risk lower extremity kinematic (eg, decreased knee and hip flexion) and kinetic (eg, increased anterior tibial shear force) patterns.26 Three successful trials were recorded after at least 1 practice trial.24–26 To complete the LESS, participants jumped forward off a 30-cm-tall box to a designated landing area 90 cm in front of the box and then immediately performed a maximal vertical jump. Trials were discarded and repeated if the participant failed to (1) jump off the box with both feet at the same time, (2) hit the landing area, (3) jump vertically after the initial landing, or (4) complete the task in a smooth motion.24–26,28
The LESS trials were recorded by a Kinect sensor (ie, depth camera, version 1; Microsoft Corp)28,29 connected to a standard laptop computer that ran Athletic Movement Assessment software (PhysiMax Technologies Ltd). The software automatically scored 16 of the 17 LESS items. The final item, overall impression, was scored by the primary author (T.A.C.). The software demonstrated good reliability (average Prevalence and Bias-Adjusted Kappa-Ordinal Scale = 0.71 ± 0.27) compared with expert consensus scores.28 Based on their LESS scores, athletes were placed into 1 of 2 movement quality groups: good (≤4) or poor (≥5).24
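To make the grouping rule concrete, here is a minimal sketch of the dichotomization; the ≤4/≥5 cut point is from Padua et al,24 as cited above, and the function name is our own.

```python
def movement_quality_group(less_score: int) -> str:
    """Dichotomize a LESS score: good movers commit <=4 errors, poor movers >=5."""
    return "good" if less_score <= 4 else "poor"

assert movement_quality_group(3) == "good"  # near this study's good-mover mean (3.35)
assert movement_quality_group(6) == "poor"  # near this study's poor-mover mean (6.13)
```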
Statistical Analysis
If an athlete missed at least 6 weeks of participation for any reason (eg, injury), his data were excluded from further analysis. Multiple observations of participants (ie, 20 participants were observed in both season 1 and season 2) violated the assumption of data independence; therefore, we conducted nonparametric analyses. To determine if our dependent variables (absolute and relative training loads) differed over time (independent variable), we used the Friedman test. Post hoc comparisons were conducted when appropriate to determine the location of statistically significant findings. A Cox proportional hazards regression model was applied to determine if movement quality moderated the relative training load over time. More specifically, we assessed whether movement quality could predict if an athlete would experience a relative training load of ≥1.516 at some point during the season. An a priori α level of .05 was used for all statistical analyses.
Playing status (ie, starter versus reserve) and position may moderate measures of training load,17,21,22 but our study was not powered to include these variables as covariates in our primary statistical models. However, we evaluated differences in the average absolute and relative training loads during a season between starters and reserves and among playing positions by using the Mann-Whitney U test and the Kruskal-Wallis test, respectively. An α level of .05 was also used for these secondary analyses.
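For illustration, a minimal Python sketch of these analyses on synthetic data follows (scipy for the nonparametric tests, lifelines for the Cox model); all variable and column names are ours, and the data are random stand-ins rather than study values.

```python
import numpy as np
import pandas as pd
from scipy.stats import friedmanchisquare, mannwhitneyu, kruskal
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Synthetic stand-in data: 46 athlete-seasons x 18 weeks of absolute
# internal training load (AU). Shapes mirror the study, not its data.
weekly = pd.DataFrame(rng.normal(3000, 600, size=(46, 18)),
                      columns=[f"week_{i + 1}" for i in range(18)])

# Friedman test: do loads differ across weeks (repeated measures)?
f_stat, f_p = friedmanchisquare(*[weekly[c] for c in weekly.columns])

# Cox proportional hazards model: does movement quality (poor = 1, good = 0)
# predict time to the first relative load >= 1.5? Column names are illustrative.
surv = pd.DataFrame({
    "weeks_to_spike": rng.integers(5, 19, size=46),
    "spiked": rng.integers(0, 2, size=46),
    "poor_mover": rng.integers(0, 2, size=46),
})
cph = CoxPHFitter().fit(surv, duration_col="weeks_to_spike", event_col="spiked")
print(cph.hazard_ratios_)  # hazard ratio for poor_mover

# Secondary analyses on season-long averages: playing status (Mann-Whitney U)
# and playing position (Kruskal-Wallis).
season_avg = weekly.mean(axis=1)
status = np.arange(46) % 2    # 0 = reserve, 1 = starter (synthetic labels)
u_stat, u_p = mannwhitneyu(season_avg[status == 1], season_avg[status == 0])
position = np.arange(46) % 4  # synthetic goalkeeper/defender/midfielder/forward labels
k_stat, k_p = kruskal(*[season_avg[position == i] for i in range(4)])
```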
RESULTS
A total of 32 (62%) athletes sustained a low back or lower extremity injury during the 2 seasons studied. Six athletes were excluded from further analysis because they missed ≥6 weeks of time during a season. Of the 46 participants retained, most were considered poor movers (n = 29; LESS score = 6.13 ± 1.24), and the remaining 17 were good movers (LESS score = 3.35 ± 0.70). Injury proportions were similar between groups: 58% of both good (10/17) and poor (17/29) movers sustained an injury.
Across the entire cohort, the absolute and relative internal training loads differed across weeks of the season (P values < .001). Mean weekly absolute and relative training load values for the entire cohort and by movement quality are shown in Table 3. The Figure illustrates the average absolute (bars) and relative (line) training load for the combined cohort across the 2 competitive seasons. Movement quality did not moderate the relative training load (P = .264). The resulting hazard ratio was 1.29 (95% CI = 0.83, 2.01).
Starters (2133.10 ± 229.53 AU) reported a higher season-long average absolute training load than reserves (1781.56 ± 213.42 AU; P < .001), but the relative training load did not differ between groups (starters = 0.98 ± 0.08, reserves = 0.98 ± 0.06; P = .709). The season-long average absolute training load did not differ among playing positions (goalkeepers = 1842.07 ± 151.13 AU, defenders = 2028.31 ± 277.83 AU, midfielders = 1924.72 ± 347.06 AU, forwards = 1946.32 ± 212.36 AU; P = .426). Similarly, the season-long average relative training load did not differ by position (goalkeepers = 1.01 ± 0.03, defenders = 0.94 ± 0.03, midfielders = 0.98 ± 0.08, forwards = 0.98 ± 0.07; P = .652).
DISCUSSION
The purpose of our investigation was to determine how absolute and relative internal training loads changed during a men's collegiate soccer season and how movement quality moderated an individual's relative training load. Our results demonstrated differences in absolute and relative training loads during a season, which supported our hypothesis and were consistent with the existing literature.17,18,20,22 However, contrary to our hypothesis, movement quality did not affect relative training load in our cohort of men's Division I collegiate soccer players.
Training Load
Research on absolute and relative training loads has focused on both professional and collegiate athletes.10,12,13,15,17–22 The absolute training loads we observed appeared consistent with values previously reported for collegiate soccer players, but direct comparisons should be made cautiously because key descriptors (eg, average daily versus average weekly absolute training load) were not explicitly stated in earlier studies. For example, Huggins et al18 found an sRPE value of 600 AU at their preseason assessment. If 600 AU is assumed to be a daily average, then the weekly absolute training load (600 AU × 7 days = 4200 AU) would be consistent with the week 1 value (approximately 4500 AU) we observed. Similarly, Walker et al20 noted approximately 3600 AU at their initial assessment and decreasing values throughout the season (approximately 1750 AU at the final assessment) in female collegiate soccer players. These values are consistent with our results except for our spikes in weeks 13 and 14. However, our values are much higher than those of Ryan et al,21 who reported an average of approximately 1000 AU during the first 3 weeks of the season that persisted throughout their chosen time periods (ie, blocks of weeks). Averaging across weeks of play may explain some of the differences, but coaching styles, team form, and the competitive level of the team may also play roles. Therefore, it is important to interpret the results of studies that captured data from a single team, including ours, with caution.
Interestingly, professional soccer players15 who experienced a 1-week absolute training load of ≥1500 to ≤2120 AU in the preseason were at higher risk of injury than those who experienced ≤1500 AU. Our mean week 1 absolute internal training load of approximately 4500 AU was triple the lower limit of that threshold, and 100% of our participants experienced loads above it.15 Although subsequent injury patterns are beyond the direct scope of this investigation, the high loads experienced may explain why approximately 58% of both good and poor movers sustained injuries during the study period. Furthermore, season duration should be considered when comparing these early-season internal training load values between collegiate and professional athletes. Professional soccer seasons may last 9 to 11 months, whereas NCAA seasons are less than 5 months long. The shorter NCAA season likely places a greater emphasis on early fitness and may explain the high internal training loads observed here and by previous authors.18,20 Also, NCAA rules prohibit student-athletes from participating in team activities, including the reporting of training load data, before the official season start date. As a result, we were not able to calculate a relative training load until week 5 of the season and, thus, could not identify a relative training load spike in the first weeks of the season, as is typically present in professional soccer players.13,30
The lowest observed training loads occurred during the middle segment of the season and may have reflected the smaller number of days between matches in weeks 7 to 12 (3.15 ± 1.27 days) relative to the late-season segment (ie, weeks 13–18), which had the most days between matches (7.40 ± 3.37 days). This finding is consistent with previous results22 indicating that, within collegiate soccer, <4 days between matches resulted in smaller training loads than ≥4 days between matches. We speculate that fewer days between matches result in shorter or less intense training sessions to maximize performance in games. In contrast, our late-season spike in absolute and relative training loads was likely due to 2 schedule features that were identical in both seasons: (1) an 8-day break between the final regular season game and the conference tournament and (2) a first-round conference tournament loss that led to a roughly 2-week gap before the first round of the NCAA tournament.
Match congestion (ie, scheduling) has been the focus of several groups.19,31–33 For example, Dupont et al31 found that 3 to 4 days between matches was enough to maintain physical performance in youth soccer players and that sRPE did not change during a match-congested schedule (ie, a schedule with <1 day between matches). Conversely, Rabbani et al19 noted an increase in absolute training load during a match-congested schedule relative to a standard training-only schedule. Although match congestion may be detrimental, our data suggested that larger breaks between matches may have a greater effect on training load spikes.
Movement Quality
As we are, to our knowledge, the first to examine whether movement quality moderates internal training load, it is difficult to contextualize the results. Of the 46 participants analyzed, most (63%) were identified as poor movers. This percentage is much higher than the 38.3% of 827 youth soccer athletes (age = 13.9 ± 1.8 years; 42% males, 58% females) whom Padua et al24 identified as poor movers. The inability of movement quality, as assessed via the LESS, to moderate internal training load may be due to a number of factors. First, a jump-landing task may not be sensitive to soccer-specific movements and, thus, to the internal training load experienced by soccer players. Second, the short season duration and the homogeneity of load for NCAA athletes (eg, league play followed by tournaments) may also limit the ability of movement quality to moderate the internal training load in male collegiate soccer players. Although this is speculative, movement quality may be a more effective moderator in professional settings, where the season is longer and athletes' loads are more heterogeneous because of league, concurrent tournament, and potentially international play. It is also possible that movement quality changes throughout a season and, therefore, our assessment at the beginning of the season did not capture how movement quality may moderate the internal training load as the season progresses.
Clinical Implications
Movement- and training-specific variables are important factors related to injury risk, but collegiate athletes spend most of their time outside athletics (ie, a maximum of 4 hours per day on team activities).34 Hence, other factors, such as sleep hygiene and mental fatigue, which have been linked to declines in general35 and soccer-specific36 performance, should be considered by future researchers in relation to internal training load and injury risk. Movement quality, as assessed via the LESS, did not moderate the training load in our cohort, yet movement quality and internal training load can be valuable data points for coaches and sports medicine personnel. Given the homogeneity of training load within a team and NCAA rules, the specific absolute and relative training load cut points used by previous researchers may not be the most appropriate metrics for clinical use. Instead, it may be more pertinent to focus on an individual player's sRPE relative to the sRPEs of teammates21 of similar playing status.17 Players who report higher than normal sRPE values relative to those of teammates of similar playing status would be flagged for follow-up, as sketched below. Such an approach would emphasize individual variability and change in determining how measures of training load could be used to gauge the level of injury risk.
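As one hedged illustration of this teammate-referenced approach, the sketch below flags players whose post-session RPE sits well above that of teammates with the same playing status; the column names and the z-score cutoff of 1.5 are our own illustrative choices, not values from this study.

```python
import pandas as pd

def flag_high_srpe(df: pd.DataFrame, z_cut: float = 1.5) -> pd.DataFrame:
    """Flag players whose session RPE is unusually high relative to teammates
    of the same playing status (starter vs reserve).

    `df` needs columns: player, status, srpe. The z-score cutoff is an
    arbitrary illustrative choice.
    """
    grouped = df.groupby("status")["srpe"]
    z = (df["srpe"] - grouped.transform("mean")) / grouped.transform("std")
    return df.assign(srpe_z=z, flagged=z > z_cut)

# Hypothetical post-session report.
report = pd.DataFrame({
    "player": ["A", "B", "C", "D"],
    "status": ["starter", "starter", "reserve", "reserve"],
    "srpe": [420.0, 690.0, 310.0, 330.0],
})
print(flag_high_srpe(report))  # player B stands out among starters
```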
LIMITATIONS
Several limitations of this investigation should be noted. First, we examined a small sample from a single Division I men's collegiate soccer team. Therefore, our results may not translate to female collegiate athletes or soccer players of different ages. Second, only 1 movement quality assessment tool (ie, LESS) and 1 marker of training load (sRPE) were used. Other tools (eg, Fusionetics) and markers of training load (eg, wellness inventories) should be assessed in future studies. Additionally, because of the small sample size and lack of statistical power, we did not account for how factors such as playing status (ie, starter versus reserve) and playing position may have moderated the absolute and relative training loads in our statistical models.
CONCLUSIONS
Differences were present in absolute and relative internal training loads across weeks of an NCAA Division I men's soccer season. Most of the cohort was classified as poor movers (63%) and sustained an injury (62%) during the study period. However, movement quality, as assessed via the LESS before the season, did not moderate relative internal training loads across a competitive season in collegiate male soccer players.