CROTIN, RYAN L.1,2; KOZLOWSKI, KARL1,3; HORVATH, PETER1; RAMSEY, DAN K.1
Pitch counts in baseball have long served as a surrogate measure of physical effort or workload and are often used to infer generalized physiologic fatigue. In sport research, physiologic fatigue is typically reported as a neuromuscular, immunological, or biomechanical response (1,4,6,11–13,15–17) or is subjectively evaluated (6,12,13,17,19). Yet pitchers’ roles and workloads vary, as evident in starting versus relief pitcher roles and in pitching workload characteristics, including pitch count, pitch type (fastball vs off-speed), appearances, recovery, and innings pitched. Relief pitchers generally throw a greater proportion of high-velocity pitches at pitch counts well below 80, facing anywhere from a single batter to one or two full innings, two to three times per week, and sometimes in consecutive game appearances. Relief pitchers may also throw with greater effort earlier during the warm-up period in the bullpen. Starting pitchers may throw complete games (nine innings), accumulating well over 100 pitches at various velocities. Seasonal accumulation, as well as nongame throwing (i.e., long toss or sideline bullpen sessions), should also be factored into repetitive stress estimations for both pitching populations. Workload may therefore not be simply proportional to pitch counts, and the criterion by which pitching workload is assessed may be misrepresented. Consequently, exertion may not be generalizable to pitch count and therefore may not be a viable predictor of injury risk (3,8,18).
Analysis of fastball velocity trends during a game is an alternate measure for quantifying pitching exertion (2,6,11,14,15,18). Increased pitch workload (innings and pitches thrown) has been associated with diminished fastball velocity, and although not reproducible, altered shoulder, trunk, and lead knee biomechanics have been reported (1,6,16). Functional muscle fatigue, evidenced by rotator cuff and scapular stabilization strength deficits, has been associated with declines in average fastball velocity, with reductions between 2 and 5 mph (11,15). Reduced fastball velocities may suggest the onset of protective mechanics, in which altered throwing arm kinematics and kinetics decrease the shoulder and elbow tensile stress thought to exacerbate repetitive stress injury (1,6,16).
Over the course of a game, altered pitching mechanics may be a reactive response requiring coordinated biomechanical changes to maintain or increase fastball velocity in the face of fatigue. Instant velocity feedback in competition may also prompt pitchers to adapt their biomechanics when fastball velocity declines. Most compensations are thought to result from tissue microtrauma and elevated physiologic cost (overexertion), which have been related to increasing workloads (pitch counts) and pitch frequency (innings, games) (3,6,11,15,16,18). Whereas declines in ball velocity may be considered protective, altering throwing mechanics in response to overexertion and pain may increase throwing arm stress to maintain performance (3,6,11,15,16,18). When compensatory throwing mechanics amplify the magnitude, rate, and frequency of loading, as well as alter joint reaction forces, the internal soft tissues may be overwhelmed (10,21). A 36-fold increase in injury risk has been reported in pitchers who competed in a “fatigued” state (17). The evidence suggests that competing while overexerted likely increases repetitive stress magnitudes for the throwing arm, which may be linked to biomechanical compensations.
Compensatory lower body mechanics attributable to altered stride length can affect forward propulsion and braking and may be a better, readily monitored indicator of overexertion. To our knowledge, altered stride length as a biomechanical response to overexertion has never been examined. In baseball, stride length is defined as the calcaneal distance from the drive ankle at peak knee height (PKH) during the wind-up to the stride ankle at stride foot contact (SFC). In this investigation, we examined whether varying stride length affected physiologic outcomes and ball velocity. We propose that, as games progress, pitchers who throw at stride lengths greater than 75% of body height will report greater lower body exertion, with increases in HR and metabolic response. In contrast, when pitching at stride lengths less than 75% of body height, we expect self-reported lower body exertion and physiologic measures to be reduced. In both instances, we expect fastball velocities to be maintained, thereby demonstrating stride length variation as a plausible compensatory adaptation to overexertion.
A total of 19 collegiate and highly skilled high school pitchers (15 right-handed and 4 left-handed) were recruited from local collegiate and travel baseball programs using flyers and personal contact (height, 1.84 ± 0.054 m; mass, 82.14 ± 0.054 kg; age, 18.63 ± 1.67 yr). All had competed for at least five seasons, all were uninjured at the time of testing, and none had preexisting throwing arm injuries that required surgery. Testing was undertaken indoors in a biomechanics laboratory. All participants signed an informed consent form, and parental consent was obtained for minors. The study was approved by the University at Buffalo’s Children and Youth Institutional Review Board.
A blinded randomized crossover design was used to assign pitchers to throw two simulated games, beginning with either i) a 25% increased stride length (overstride (OS)) or ii) a 25% reduced stride length (understride (US)) relative to their desired stride length (DSL). Participants were crossed over to the alternate condition after a minimum of 72 h of rest had elapsed. Allocation was determined by simple randomization from a random numbers table. Three-dimensional movement patterns and ground reaction force measures of the drive and stride legs were obtained using an eight-camera VICON motion analysis system (Oxford Metrics, UK) integrated with two Kistler force platforms (Kistler Instrument Corp., Amherst, NY), sampling at 240 and 960 Hz, respectively. A total of 70 retroreflective markers were used to distinguish body segments and to define their joint centers and segmental coordinate systems (Table 1). A rigid thermoplastic shell affixed with three noncollinear markers was secured over the sacrum using Velcro and elastic overwrap (SuperWrap™; Fabrifoam, Inc., Exton, PA) around the waist to track pelvis movement. Reflective tape was secured to the baseball to visualize ball release (BR).
The measurement volume was dynamically calibrated before each simulated game (Vicon Nexus, Oxford Metrics). The global reference system was oriented with the +X axis mediolateral, the +Y axis directed anteriorly in the direction of the throw, and the +Z axis superior. Static standing calibrations were then recorded before movement trials.
Pregame warm-up and velocity recordings
After removal of the anatomical markers, pitchers warmed up by throwing 30–40 pitches into a catch net (Rawlings Group, St. Louis, MO) at a distance of 5.69 m. The first 25 pitches were thrown at the DSL, whereas the remaining pitches were thrown with compensated strides. Baseballs were thrown directly toward a radar gun accurate to within ±0.5 mph (±0.80 km·h−1) (Jugs Sports, Tualatin, OR) positioned behind the net at a height of 1.02 m to best track ball velocity. Ball velocity was relayed to an LED display (Jugs Sports) for instantaneous feedback to ensure fastballs were thrown maximally.
Stride length determination preceding and during simulated game conditions
Motion recordings and ball velocities were obtained between the 20th and 25th warm-up pitches while throwing at 100% effort at the DSL. Kinematic and kinetic data were visually inspected using Vicon Nexus software (Oxford Metrics). PKH was identified as the highest vertical displacement of the suprapatellar marker during the wind-up phase, and SFC was identified when pitchers contacted the opposing force plate. The two fastest pitching trials within the 20th to 25th pitches were used to derive each pitcher’s DSL, calculated as the distance between the drive foot calcaneus at PKH and the stride foot calcaneus at SFC (Fig. 1) and then averaged. The DSL was then either increased or decreased by 25% according to the assigned condition. Areas over the force platforms were marked to indicate drive foot and stride foot placement for both the OS and US conditions, and participants were encouraged to contact the targets during the simulated games. Ample warm-up pitches before motion recordings were provided to acclimatize pitchers to the OS or US condition. Twenty pitches were thrown per inning with a ratio of three fastballs to one change-up during simulated play. Approximately 15 s of rest was allocated between pitches, with 9 min of rest prescribed between simulated innings. Five warm-up pitches were allocated before each simulated inning. Testing ceased after the 80th pitch. In total, each pitcher threw approximately 130 pitches per simulated game.
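The stride-length derivation above reduces to simple arithmetic: average the calcaneal distances of the two fastest trials to obtain the DSL, then scale by ±25% for the two conditions. A minimal sketch (not the authors' code; the function name and input values are hypothetical):

```python
# Illustrative sketch of deriving the desired stride length (DSL) and the
# overstride (OS) / understride (US) targets described in the Methods.
# Inputs are drive-foot-calcaneus (at PKH) to stride-foot-calcaneus (at SFC)
# distances, in meters, from the two fastest warm-up pitches.

def stride_targets(fastest_trial_strides_m, scale=0.25):
    """Average the two fastest-pitch stride lengths to get DSL,
    then scale by +/- 25% for the OS and US conditions."""
    dsl = sum(fastest_trial_strides_m) / len(fastest_trial_strides_m)
    return {"DSL": dsl, "OS": dsl * (1 + scale), "US": dsl * (1 - scale)}

targets = stride_targets([1.22, 1.26])  # hypothetical stride lengths (m)
# DSL = 1.24 m, OS target = 1.55 m, US target = 0.93 m
```

The resulting OS and US distances correspond to the floor markings over the force platforms that participants were asked to contact.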
Data management and postprocessing of kinematic and kinetic data
Visual 3D software (C-Motion Inc., Rockville, MD) was used to postprocess kinematic and kinetic data. Marker trajectories and ground reaction force data were filtered using a second-order bidirectional Butterworth low-pass filter at 13.4 Hz (5,7,9,10) and 40 Hz, respectively. Ground reaction force data were normalized to body weight for both drive and stride legs, from which SFC was determined as the instant the leading foot exceeded a vertical ground reaction force of 5% body weight on the second force plate (Fig. 1). BR, identified by visual inspection of the kinematic data and linear hand velocity, was used to terminate the pitching cycle. PKH, SFC, maximal external shoulder rotation, and BR provided hallmark events in each trial, from which the pitching cycle was time normalized from PKH to BR.
Before each test, 9-min resting baseline measures were collected: baseline HR (BHR), salivary cortisol (C), salivary alpha-amylase (AA), self-reported exertion scores (SES) for the lower body and throwing arm, and baseline blood glucose and blood lactate (BLA). To reduce contamination of salivary hormones, subjects abstained from brushing their teeth and drank only water in the hour before participation. No caffeine was permitted. The SES is a self-report measure of perceived exertion consisting of a 10-cm horizontal visual analog scale, where “0” represents no exertion and “10” maximal exertion. Exertion was assessed for both the throwing arm and the legs, with baseline measures obtained before testing and response measures taken immediately after pitching at the beginning of each rest period. Scores were expressed as percentages, with higher scores indicating greater self-perceived exertion.
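Converting a mark on the 10-cm visual analog scale to the percentage scores reported here is a direct proportion. A minimal sketch under that assumption (function name and mark value are hypothetical):

```python
# Illustrative conversion of a visual analog scale (VAS) mark to an SES
# percentage, per the 10-cm scale described above.

def ses_percent(mark_cm, scale_length_cm=10.0):
    """Express a mark on the 0-10 cm VAS as percent exertion (0-100%)."""
    return mark_cm / scale_length_cm * 100

ses_percent(3.5)  # → 35.0 (a hypothetical mark at 3.5 cm reads as 35% exerted)
```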
Polar RS300X HR monitors (Polar, Finland) recorded pitching HR after every fifth pitch (four measures per inning) and at 30-s intervals over the 9-min recovery periods. BHR, pitching HR (PHR) per inning, end-of-inning HR (EIHR) between innings, pitching intensity (PI) per inning, and recovery capacity (RC) between innings and postgame were recorded. PI was calculated as a percentage of the pitcher’s HR reserve (HRR). RC expressed the last HR measured at the end of each inning and at the conclusion of each game as a percent difference from BHR. An ensemble average across all innings provided game means for all HR variables, as well as SES. The formulas below define PI and RC:
a) PI = [(PHR − BHR)/HRR] × 100, where HRR = HRmax − BHR and HRmax = 220 − age
b) RC = [(EIHR − BHR)/BHR] × 100
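The two heart-rate formulas above can be worked through with a short example. This is an illustrative sketch only; the pitcher's values (age 19, BHR 65 bpm, PHR 125 bpm, EIHR 75 bpm) are hypothetical:

```python
# Illustrative implementation of the PI and RC formulas defined above.

def heart_rate_reserve(age, bhr):
    """HRR = HRmax - BHR, with HRmax estimated as 220 - age."""
    return (220 - age) - bhr

def pitching_intensity(phr, bhr, age):
    """PI = [(PHR - BHR) / HRR] x 100, i.e., percent of HR reserve used."""
    return (phr - bhr) / heart_rate_reserve(age, bhr) * 100

def recovery_capacity(eihr, bhr):
    """RC = [(EIHR - BHR) / BHR] x 100, i.e., end-of-inning HR as a
    percent difference from baseline (lower = better recovery)."""
    return (eihr - bhr) / bhr * 100

# Hypothetical 19-yr-old pitcher: BHR 65, PHR 125, end-of-inning HR 75 bpm.
pi = pitching_intensity(125, 65, 19)  # (125-65)/((220-19)-65)*100 ≈ 44.1%
rc = recovery_capacity(75, 65)        # (75-65)/65*100 ≈ 15.4%
```

Note that a larger RC indicates an end-of-inning HR further above baseline, which the authors interpret as reduced recovery.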
Salivary C and AA samples were collected at baseline from sublingual oral swabs (Salimetrics, State College, PA). A second oral swab was placed over the parotid gland at 5 min postpitching to collect AA, and a third oral swab was placed sublingually at 15 min postpitching to extract postpitching C. All samples were collected for 3 min, placed in a collection tube, and stored on ice. Samples were spun at 3000 rpm in a centrifuge at 8°C for 15 min. Salivary specimens were aliquoted into labeled 100-μL cryovials and frozen at −80°C. Specimens were shipped to a commercial laboratory (Salimetrics) for analysis. Baseline and postgame measures of blood glucose and BLA required blood lancets, capillary tubes (Bayer, PA), separate collection strips, and metabolite analyzers (Roche).
Mean fastball velocities per game and per inning, linear throwing hand velocities at BR, and physiologic measures for both OS and US conditions were analyzed. Of the 80 pitches thrown during simulated play, the two highest velocity pitches identified from radar data during the first and last innings were selected, from which mean peak linear throwing hand velocity at BR was derived. Independent t-tests were used to assess differences in fastball and throwing hand velocity, as well as all physiologic measures, between OS and US pitching. Repeated-measures ANOVA compared incremental HR and SES from the first to the last inning to delineate whether stride length affected exertion within each condition. A Bonferroni correction was applied for multiple comparisons. Paired t-tests compared postpitching salivary and blood metabolite responses to baseline measures. Statistical significance was set a priori at P ≤ 0.05 for all tests. All statistical analyses were performed using SPSS 19 (SPSS Inc., Chicago, IL).
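The Bonferroni correction mentioned above divides the family-wise alpha by the number of comparisons so that each individual test is held to a stricter threshold. A minimal sketch (the comparison count of 3 is illustrative, not the authors' exact number):

```python
# Illustrative Bonferroni adjustment for multiple pairwise comparisons.

def bonferroni_alpha(alpha, n_comparisons):
    """Per-test significance threshold after Bonferroni correction."""
    return alpha / n_comparisons

# e.g., comparing adjacent innings (1 vs 2, 2 vs 3, 3 vs 4) within a condition:
bonferroni_alpha(0.05, 3)  # ≈ 0.0167 per test
```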
When normalized to body height, the DSL averaged 67% of body height, whereas the OS and US were 76% and 52%, respectively (Fig. 2). Table 2 summarizes physiologic responses and self-reported exertion ratings for the throwing arm and legs relative to stride length. Overall, cardiovascular responses differed within and across pitching conditions. For US pitching, pairwise comparisons showed PHR decreased progressively across innings, with significant reductions between the first and second, second and third, and third and fourth innings (all P < 0.001). Mean PHR was significantly lower in the last inning than in the first (P < 0.001), an 11.1-bpm reduction (128.1 to 117.0 bpm). In contrast, PHR during OS pitching remained consistently elevated at approximately 125 bpm across all innings.
PI, indicated by %HRR, was significantly lower in US pitching, decreasing 8.9% between the first and last innings. Incremental analysis of US pitching revealed significant decreases in PI between innings 1 and 2 (P = 0.001) and between innings 3 and 4 (P < 0.001). OS pitching showed equivalent PI throughout the simulated game, averaging 37.6%. EIHR measures (final 30 s) taken during the first and last inning recovery periods decreased by as much as 7.4 bpm in US pitching (P = 0.005). Greater percent increases expressed the degree to which EIHR exceeded BHR and therefore indicated a reduced ability to recover. US pitching revealed greater overall recoverability than OS pitching (US, 9.04%, vs OS, 14.8%; P = 0.012). The greatest disparity in RC between conditions occurred after 60 pitches thrown (US, 3.61%, vs OS, 13.3%; P = 0.029).
Although mean SES for the throwing arm was equivalent between stride conditions, within-group ratings increased statistically and progressively from the first to the last inning, with the highest exertion occurring during the final inning irrespective of stride length (P < 0.001). Across-group comparison of lower body exertion scores showed significantly lower scores for the US condition than for OS, but only in the third inning (P = 0.02). Within-group analysis found significantly increased perceived lower body exertion in the US condition only between innings 1 and 2 (P < 0.001), whereas the OS condition showed increases between innings 1 and 2 (P < 0.001) and between the second and third innings (P = 0.005).
Postgame blood glucose concentrations were statistically lower than baseline for both the OS (18.4 mg·dL−1) and US (12.2 mg·dL−1) conditions (P = 0.007). Blood lactate exhibited a significant reduction from baseline for OS (0.53 mmol·L−1, P = 0.018) but remained unchanged with US pitching. Salivary AA was unchanged in US pitching but was statistically higher than baseline with OS pitching (150.8 to 245.4 U·mL−1, P = 0.011), whereas salivary cortisol was statistically lower than baseline with US pitching (0.290 to 0.134 μg·dL−1, P = 0.001).
Throwing hand and fastball velocity characteristics are listed in Table 3. No significant differences were observed, indicating that stride length compensations did not affect mean or peak fastball velocities between innings.
Professional and collegiate baseball have adopted ball velocity and pitch counts as surrogate measures of physical exertion, although the efficacy and validity of these traditional surveillance strategies are questionable. Our results confirmed our hypothesis that compensatory lower body biomechanics, evident as altered stride lengths, can reflect overexertion without affecting peak or average ball velocities. We introduce the concept of screening biomechanical variations and adaptations as factors influencing the physiologic stress associated with traditional workload measures (games played, innings pitched, and pitches thrown).
We suggest that pitching mechanics are not energetically equivalent within or between pitchers, where even slight variations may affect cardiovascular, adrenal, cognitive, and neuromuscular responses. Confirming our hypothesis, elite pitchers adapted their mechanics to maintain ball velocity while reducing physiologic demands. Our investigation identified stride length compensation (changes in drive and stride foot calcaneal distance) as a plausible response to overexertion, where compensations can be subtle and ongoing throughout games without affecting pitch velocity. We have shown that skilled baseball pitchers can maintain average and peak fastball velocities across a 45-cm change in stride length (Fig. 2).
Compensatory strides varied proportionately with pitchers’ body heights and revealed different physiologic consequences when pitchers lengthened or shortened strides from the desired stride. Shortened strides reduced PHR, PI, and EIHR, collectively improving RC between innings. Salivary responses indicated that AA, an analyte used to infer systemic epinephrine, was unaffected by stride length reduction, and C, a secondary measure of adrenal stress, actually decreased from baseline with reduced stride length.
Increased stride length appears more physiologically taxing, because increased adrenal output, inferred from significant AA elevations above baseline, was observed. In contrast to reduced stride length, games simulating elongated strides presented elevated cardiovascular responses for PHR, PI, and EIHR, while RC was reduced. Pitching at greater strides lowered BLA from baseline, and stride length shortening revealed no change, indicating that stride length variance does not cause lactate to efflux into the bloodstream.
SES can be a powerful tool for estimating overexertion in competition. Our visual analog scoring method was used to determine whether pitchers could intrinsically sense their own level of regional exertion (throwing arm and lower extremities) associated with changes in stride length. SES demonstrated incremental exertion effects for the throwing arm, as perceived ratings escalated in parallel with pitches thrown and innings pitched. We believed that reduced stride lengths would increase throwing arm exertion, potentially encouraging pitchers to lengthen strides. This hypothesis was not supported, because throwing arm exertion patterns were not distinguishable between stride length conditions.
Pitchers in this study intrinsically perceived differences in lower body exertion between conditions, because self-reported exertion was greater for OS pitching after 60 pitches thrown (OS, 35.2%, vs US, 22.0% exerted). No further increases in exertion were present during the fourth inning of play, representing pitches 61–80. This suggests that stride length compensations in starting pitchers may occur much earlier, because exertion can be sensed well before the current 100-pitch workload limit in professional baseball. Pitchers on reduced pitch counts, such as relievers (throwing fewer than 40 game pitches), may not cognitively perceive lower body exertion differences despite ongoing stride length compensations. Because PHR, PI, EIHR, and RC responses differed early within simulated games, internal physiologic signals may evoke compensatory strides before intrinsic exertion is detected.
Supported by physiologic measures and self-reported scores, even slight stride length reductions can offer a suitable adaptive strategy to decrease the perceived exertion associated with the elevated physical effort of elongated strides. Ultimately, physical conditioning and frequent range-of-motion assessments have tremendous roles in maintaining stride length consistency for pitchers with large strides relative to body height (greater than 75% BH), because compensation may occur long before 80 pitches and potentially persist beyond 120 pitches thrown.
We showed that greater stride lengths can be physiologically more demanding and can encourage stride length reductions. Although the absence of a pitching mound is a potential limitation of our study, the radar gun angle was set to best represent the pitching delivery at full distance. Regardless of the pitching surface, our research provides a model for exploring how baseball pitching physiology is influenced by changes in stride length. Pitchers throwing from a mound present greater stride lengths relative to body height (approximately 80%–86% BH) than our OS pitching population (76% BH) (5,7). Given the inherent differences in stride length at foot strike when pitching from a mound versus level ground, and the intuitive disparities in kinematics and kinetics between the two conditions, pitchers are susceptible to lower body overexertion, which can affect DSL consistency and interfere with accuracy and velocity performance. It is also important to note that exertion need not be the only influence inducing compensation, because skilled pitchers reduce strides when throwing curveballs (5,7,10,19,20).
Pitchers of higher skill level differ from youth pitchers, because greater anthropometry, pitch type variety, and competitive pressure most likely enhance their ability to adapt mechanics to exertion (5,7,9,10). The element of competitive pressure inclines us to believe that professional pitchers regularly adopt compensatory mechanics, as exertion and psychological stress peak in professional competition. Game performance at the professional level is tied to many economic factors (i.e., arbitration, endorsements), and compensatory adaptations that maintain ball velocity can prolong innings pitched and thereby improve competitive statistics. Competitive statistics are associated with attaining and maintaining peak ball velocities, because velocity differentiation (the difference from peak fastball to minimum off-speed velocity) can increase deception of opposing batters. When velocities are maximized, hitters have less time to predict pitches and react. Professional pitchers form a special population that receives continual velocity feedback in games, because almost all professional stadiums display radar gun readings. This real-time velocity feedback further reinforces the opportunity for compensatory biomechanics to arise. We believe that subtle stride length compensations have gone entirely unnoticed in professional baseball, where radar velocity feedback can guide adapted movement strategies to overcome ball velocity depreciation over the course of games.
Future work assessing traditional workload metrics (games played, innings pitched, and pitches thrown) may reveal weak associations with injury, whereas consistent biomechanical screening may improve orthopedic health outcomes by predicting pathomechanic, adaptive movement strategies. In Major League Baseball, most preventative training strategies focus on the musculature of the throwing arm, conditioning joint interactions of the distal kinetic chain (scapulohumeral, radiohumeral, humeroulnar, radioulnar, and wrist). Overemphasizing distal chain training may not reduce injury rates, whereas greater attention to strength and conditioning of the proximal kinetic chain (ground–ankle–knee–hip–torso) can minimize overexertion-related injuries caused by altered proximal-to-distal momentum sequencing.
The research team extends their gratitude to Jugs Sports for providing the professional radar gun and LED display.
We would also like to thank our research associates Jennifer Martins, Philip Mathew, Joseph Westlake, Alyssa Herman, Connor McNally, and Laura Dipasquale for recording ball velocities and assisting in physiologic data collection. We also sincerely thank Shivam Bhan, Michael Stangroom, Zach Anderson, and John Rusin for their assistance in establishing the motion capture environment and pitching model.
Internal funding was received for this work from the University at Buffalo. No external funding or benefits were received or will be received from any commercial party related directly or indirectly to the subject of this article.
None of the authors report a professional relationship with a company or manufacturer who will benefit from the results of the present study.
Results of the present study do not constitute endorsement by the American College of Sports Medicine.