A high incidence of overuse injuries of the lower extremity has been reported for military recruits (10). The use of shock-absorbing insoles has been identified as one potential method for reducing lower-extremity overuse injuries (13,16,23). It has been suggested that these insoles protect against injury by reducing the magnitude and rate of initial loading at heel strike and by redistributing the ground reaction force across the foot plantar surface, thereby reducing the loads transmitted to the skeletal system (17). This suggestion has been supported by Windle et al. (24), who reported that, when new, shock-absorbing insoles attenuate the peak pressures at the boot–foot interface at heel strike and during forefoot loading.
Mechanical tests of insole materials have found that repeated loading simulating that which occurs in running results in an increase in mechanical stiffness (11). This implies a reduced ability to attenuate heel impacts when previously used insoles are placed in footwear. However, the subsequent influence of insole degradation on the loads occurring during running has not been assessed. Previous studies have shown that the performance of insoles in material tests is not directly related to their performance when used by humans (17). This has been attributed to human adaptations, such as changes in joint angles and muscle activity, affecting lower-extremity stiffness (3,9). It is therefore necessary to assess the impact-absorbing ability of degraded insoles during running before recommending an insole for routine use by recruits.
In addition to potentially influencing ground reaction forces during running, the use of insoles may positively affect lower-extremity kinematics. Biomechanical evidence indicates that inadequate cushioning at the foot-shoe interface results in compensatory adjustments in lower-extremity kinematics, reducing lower-extremity stiffness (3,9). Hence, running in military footwear without the artificial cushioning provided by a shock-absorbing insole may result in unnecessary and possibly excessive joint movements of the lower extremity, which may be associated with the occurrence of lower-limb injuries. There is also evidence that the heel raise provided by some insoles may influence lower-extremity kinematics. In particular, the peak ankle dorsiflexion angle has been found to be reduced by heel lift, with a suggested consequent reduction in Achilles tendon strain (7). Study of the influence of in-shoe devices should therefore include measurement of kinematic effects.
The aim of the present study was to establish which of four shock-absorbing insoles provides the most impact absorption in running after mechanical degradation simulating 100 km of running impacts. The insole samples were chosen based on the findings of an earlier study (24). It was hypothesized that each of the shock-absorbing insoles when new would reduce peak impact loading compared with a no-insole condition. However, a reduction in this shock-absorbing ability was expected after mechanical degradation. To assess the validity of the mechanical degradation, the influence of human wear on insole mechanical properties was also investigated. Additionally, the influence of the insoles on sagittal plane kinematics was assessed to identify any changes in peak joint flexion.
The experimental procedures were approved by the Ministry of Defence (Navy) Personnel Research Ethics Committee. The study was conducted in three phases: laboratory mechanical testing and degradation of insoles; biomechanical assessment of new and mechanically degraded insoles; and field testing of insoles. Four insole types were assessed. Insole A was a molded polyurethane foam footbed (Shore O 40), approximately 6 mm thick in the heel and 3 mm in the forefoot. This insole had a 1-mm layer of polyurethane elastomer (Shore O 45) inserted in the heel and ball areas of the foot. Insole B was composed entirely of the same polyurethane elastomer as found in insole A and had an approximate thickness of 3 mm. Insole C was composed of a 6-mm-thick polyurethane foam (Shore O 22) shaped by attachment to a high-density ethyl vinyl acetate (EVA) footbed (Shore O 80). Insole C had a maximum heel height of 8 mm and a forefoot thickness of 4 mm. Insole D consisted of a 3-mm base of coarse-weave plastic with a top sheet of nylon nonwoven fabric (Shore O 60). This insole is the standard issue insole provided to Royal Marine (RM) recruits; it is not aimed specifically at providing cushioning but is designed to improve ventilation within the boot, to provide thermal insulation, and to be removable for hygiene purposes.
A total of 84 RM recruits volunteered for the study, and their written informed consent was obtained in accordance with the Helsinki Declaration (25). Sixteen recruits participated in the biomechanical assessment of new and mechanically degraded insoles, and 68 recruits in the field assessment of insoles during the final 3 wk of their 30-wk training course. The 16 subjects employed in the biomechanical study had a mean mass (including added mass carried in a vest) of 93.3 (SD 8.5) kg and mean age 20 (SD 2) yr.
Laboratory degradation and mechanical testing of insoles.
The stiffness characteristics of the insoles were quantified using an Instron dynamic testing apparatus (Instron model 1125, Canton, MA). A nominal pressure of 500 kPa, reaching a peak value 50 ms after initial contact, was repeatedly applied to the heel of the insoles at a frequency of 1 Hz. This load and frequency are representative of a typical running stride (6). The Instron apparatus measured the load exerted on the insole using a hydraulically controlled load cell and the insole deformation using a displacement transducer, with 40 data points recorded per impact cycle. These values were then plotted to give a load-deformation curve, with the average gradient during loading used to represent the stiffness of the insole material. The heel areas of the insoles were mechanically degraded using the Instron apparatus over a circular area of diameter 40 mm. The insoles were subjected to 100, 1,000, 10,000, 40,000, 60,000, and 100,000 impacts, simulating approximate running distances of 0.25, 2.5, 25, 100, 150, and 250 km. For each series of impacts, data from the final five impact cycles were recorded and the mean stiffness of the material calculated over these trials.
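The stiffness calculation described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the load-deformation samples are synthetic stand-ins for the 40 points recorded per impact cycle, and a least-squares straight-line fit is one reasonable way to obtain the average gradient during loading.

```python
import numpy as np

# Synthetic loading-phase data (hypothetical): ~linear load-deformation
# behavior with a nominal stiffness of 800 N/mm plus measurement noise.
rng = np.random.default_rng(0)
deformation_mm = np.linspace(0.0, 3.0, 40)               # insole compression (mm)
load_n = 800.0 * deformation_mm + rng.normal(0, 5, 40)   # measured load (N)

# Average gradient during loading via a first-order least-squares fit (N/mm).
stiffness, _ = np.polyfit(deformation_mm, load_n, 1)
print(f"stiffness = {stiffness:.0f} N/mm")
```

In practice the fit would be restricted to the loading portion of each cycle; the unloading portion follows a different path because the materials are viscoelastic.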
The impact-absorbing ability of new and mechanically degraded insoles and samples of insoles used in the field study (after being worn) was assessed using a standard impact test procedure. The materials were stored for at least 24 h at 20°C before testing. An 8.5-kg mass was released to impact with the center of the heel of each insole sample placed on top of the sole of a combat assault boot. A drop height of 50 mm provided an impact velocity of approximately 1 m·s−1, corresponding to typical heel impact velocity occurring during running (6). Peak deceleration of the mass during impact was recorded and the shock-absorbing ability of each sample was represented as multiples of the acceleration due to gravity (peak g). After five preimpacts, the mean of the subsequent five impacts was used to provide the peak g value for each sample.
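The drop-test quantities can be checked with a short calculation. The drop height and mass are taken from the text; the accelerometer peaks below are hypothetical values used only to illustrate the peak-g averaging, not data from the study.

```python
import math

# A 50-mm free drop gives an impact velocity of v = sqrt(2*g*h),
# close to the ~1 m/s heel impact velocity typical of running.
g = 9.81              # m/s^2
drop_height = 0.050   # m
impact_velocity = math.sqrt(2 * g * drop_height)
print(f"impact velocity = {impact_velocity:.2f} m/s")  # ~0.99 m/s

# Shock absorbency is reported as peak deceleration in multiples of g:
# the mean of five impacts recorded after five conditioning pre-impacts.
peak_decel_ms2 = [520.0, 515.0, 530.0, 525.0, 510.0]   # hypothetical peaks (m/s^2)
peak_g = sum(a / g for a in peak_decel_ms2) / len(peak_decel_ms2)
print(f"peak g = {peak_g:.1f}")
```

A lower peak g indicates greater impact absorption by the insole sample.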
Biomechanical testing of new and degraded insoles.
Biomechanical data were collected in a covered hangar with an asphalt floor. Subjects wore military footwear (combat assault boots). A weighted vest, designed to carry ammunition, with pouches around the waist was also worn. During the data collection, as during training, recruits filled the pouches with stones to provide a mass of at least 10 kg. The boot alone was used as a baseline condition against which the influence of new and mechanically degraded insoles was tested, resulting in a total of nine running conditions. The degraded insoles had been mechanically impacted in the heel area 40,000 times (100 km) as described above. The subjects undertook the nine conditions in accordance with a randomized design and were not explicitly informed of which insole they had in their boots at any time.
For each running trial, subjects ran approximately 10 m at 3.3 m·s−1 (±5%), with average running velocity monitored using photocells. The subjects were required to make right foot contact with a force platform (AMTI OR6-5, Watertown, MA) sampling at 1000 Hz sited in the 10-m runway flush with the running surface. The subjects practiced running down the runway before the data collection commenced, allowing familiarization with the required speed and stride pattern. Ten successful running trials were performed per condition and force data recorded for one running step (right foot) per trial. Trials were repeated if the required running speed was not attained, or the subject made an obvious adjustment in running stride to make contact with the force platform. For each running trial, the peak impact force, time of occurrence of peak impact force (from initial ground contact), average rate of loading of impact force, peak instantaneous rate of loading of impact force, peak active force, and time of occurrence of peak active force were determined.
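The force-platform variables listed above can be extracted from a vertical force-time trace along the following lines. The trace here is synthetic, and the assumption that the impact peak falls within the first 50 ms of stance is an illustrative choice, not a threshold stated in the text.

```python
import numpy as np

fs = 1000                       # force platform sampling rate (Hz)
t = np.arange(0, 0.25, 1 / fs)  # 250 ms of stance

# Synthetic vertical GRF: an impact transient near 25 ms superimposed
# on a slower active peak near mid-stance (hypothetical magnitudes, N).
impact = 1600 * np.exp(-((t - 0.025) / 0.010) ** 2)
active = 2200 * np.sin(np.clip(t / 0.25, 0, 1) * np.pi) ** 2
fz = impact + active

# Peak impact force and its time of occurrence (searched in an early window).
early = t < 0.050
i_imp = np.argmax(fz[early])
peak_impact = fz[early][i_imp]
t_peak_impact = t[early][i_imp]

# Average rate of loading to the impact peak, and peak instantaneous rate.
avg_loading_rate = peak_impact / t_peak_impact       # N/s
inst_loading_rate = np.max(np.gradient(fz, 1 / fs))  # max dF/dt, N/s

# Peak active force and its time of occurrence.
i_act = np.argmax(fz)
peak_active, t_peak_active = fz[i_act], t[i_act]
print(f"impact {peak_impact:.0f} N at {t_peak_impact * 1000:.0f} ms, "
      f"active {peak_active:.0f} N at {t_peak_active * 1000:.0f} ms")
```

The distinction matters because, as reported below, insole effects appeared in the loading-rate variables rather than in peak impact force itself.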
Two-dimensional sagittal plane kinematic data were collected at 50 Hz for eight subjects using a Panasonic (AG-455MB, Osaka, Japan) video camera. Skin markers were placed on the lateral right lower limb to represent the hip, knee, ankle, and fifth metatarsal-phalangeal joint centers and at a point on the heel (Fig. 1). Data were recorded for one step per trial, corresponding with the force plate contact step. The kinematic data were digitized using Peak Motus software (Peak Performance Technologies, Denver, CO). Foot, ankle, knee, and thigh angles for the frame immediately before ground contact and the peak ankle and knee joint flexion angles during ground contact were identified. Kinematic data were synchronized with force data using an LED positioned in the field of view of the camera and triggered when the vertical impact force exceeded 10 N.
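A sagittal plane joint angle of the kind identified from the digitized video can be computed as the included angle between adjacent segments at a joint marker. The marker coordinates below are hypothetical, chosen only to illustrate the geometry.

```python
import math

# Hypothetical digitized frame: sagittal plane (x, y) positions in metres.
hip, knee, ankle = (0.05, 0.95), (0.10, 0.50), (0.00, 0.05)

def joint_angle(proximal, joint, distal):
    """Included angle (degrees) at `joint` between its two adjacent segments."""
    v1 = (proximal[0] - joint[0], proximal[1] - joint[1])
    v2 = (distal[0] - joint[0], distal[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

knee_angle = joint_angle(hip, knee, ankle)   # ~180 deg = fully extended
knee_flexion = 180.0 - knee_angle            # deviation from full extension
print(f"knee angle {knee_angle:.1f} deg, flexion {knee_flexion:.1f} deg")
```

Peak flexion during stance would then be the maximum of this flexion value across the frames of ground contact.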
Field testing of insoles.
To assess the validity of the mechanical degradation, 68 recruits were randomly assigned a new pair of one of the four types of insole (insole A, 22 recruits; insole B, 18 recruits; insole C, 18 recruits; and insole D, 10 recruits). The recruits were asked to wear the insoles in their boots for all the activities they undertook over a 3-wk period during which it was estimated from the training syllabus that a total marching and running distance in excess of 126 km would be covered. They were instructed to remove the insoles if they experienced pain or blisters they felt were attributable to the insoles. Upon completion of the trial period, the insoles were collected, and the experimenters completed a questionnaire with each of the recruits. The questionnaire obtained information about how long the recruit wore the insoles, and whether the insoles got wet or caused blisters or pain. All the insoles that had been worn for the full test period were assessed mechanically using the drop test procedures, and their stiffness characteristics were measured using the Instron apparatus, as previously described.
One-way analysis of variance was used to determine any significant differences (P < 0.05) between the insoles. Tukey’s method of pairwise comparisons was used to identify specific differences between insole conditions (21).
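The one-way ANOVA reduces to a ratio of between-condition to within-condition variance; a minimal sketch with hypothetical peak-g values follows (the subsequent Tukey pairwise step is omitted for brevity).

```python
import numpy as np

# Hypothetical drop-test results (peak g) for three conditions; not study data.
groups = [
    np.array([52.0, 54.0, 51.0, 53.0]),   # e.g., one insole condition
    np.array([48.0, 47.0, 49.0, 46.0]),   # e.g., a more absorbent insole
    np.array([60.0, 61.0, 59.0, 62.0]),   # e.g., boot only
]
k = len(groups)
n = sum(len(g) for g in groups)
grand_mean = np.mean(np.concatenate(groups))

# Between-group and within-group sums of squares, then the F ratio.
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
print(f"F({k - 1},{n - k}) = {f_stat:.1f}")
```

A significant F would then be followed by Tukey's pairwise comparisons to locate which insole conditions differ.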
Laboratory mechanical tests.
Stiffness results for the four insoles after 0–100,000 impacts are shown in Figure 2. For the incremental increases illustrated, the most marked changes to all insoles occurred from 0 to 10,000 impacts, and only small changes occurred beyond 40,000 impacts. Forty thousand impacts were therefore selected to degrade the insoles to be worn for the biomechanical assessments. The stiffness data indicate that insole A has the lowest stiffness when new and the magnitude of the increase in stiffness with repeated impacts was the least for this insole type.
Drop test results for the new and degraded insole samples (40,000 impacts) are presented as percentage reduction in peak deceleration compared with the boot alone (Fig. 3). Insole C demonstrates the greatest absorbency and insole D the least. For all insoles, there were reductions in impact-absorbing ability after degradation, with insole A showing the most marked reduction.
Biomechanical assessment of new and mechanically degraded insoles.
All insole conditions were compared with the control (no insole) condition and with the standard issue insole (insole D). Where results for the new and degraded insoles are different from each other, these are indicated in the text using (n) and (d), respectively.
No significant differences were identified in peak impact force (Fig. 4). Time of peak impact force was significantly later for insole C than for the control and insole D conditions (Fig. 5), suggesting a lower average rate of loading. Peak loading rate was significantly (P < 0.05) lower for insole C (n) than for the control (Fig. 6). Peak loading rate was also lower (P < 0.05) for insole A (d), insole B (n and d), and insole C (n and d) than for insole D (d), indicating a relatively high rate of loading for the standard issue insole when degraded. Peak active force was significantly (P < 0.05) lower for the insole C (d) condition than for insole D (d). There were no significant differences identified in the time of peak active force.
Statistical analysis highlighted no significant differences between insole types for any of the angles (knee, foot, ankle, and thigh) before foot strike or in the peak knee flexion angle during stance. However, peak ankle angle was significantly greater for insole C (n) than for the control and insole D conditions (P < 0.05), indicating a reduction in peak ankle dorsiflexion. Peak ankle angle was also significantly greater for insole C (d) than for insole D (d) (Table 1).
Field testing of insoles.
Of the recruits issued with insoles for field wear, 35 recruits wore their insoles for the duration of the assessment period. This provided 11 pairs of insole A, 5 pairs of insole B, 13 pairs of insole C, and 6 pairs of insole D. During the assessment period, all of the insoles became wet on occasion and were dried by the recruits in the drying room. The stiffness characteristics and shock absorbency of the insoles that had been used for the entire assessment period were tested in the laboratory. Statistical analysis revealed that the insole C samples that had been degraded by human wear were significantly (P < 0.05) less stiff than the other worn insoles and that insole A worn samples were significantly (P < 0.05) less stiff than worn insoles B and D (Fig. 7). The shock absorbency (presented as percentage reduction in peak deceleration compared with boot only) of insole C after human degradation was significantly (P < 0.05) greater than that of worn insoles A and B, which in turn were significantly (P < 0.05) greater than worn insole D (Fig. 8).
The marked increase in stiffness of all insole samples after 10,000 mechanical impacts (simulating 25 km) is consistent with the findings of previous studies (2,11). Also consistent with previous studies, there appears to be a threshold at approximately 40,000 impacts (100 km) beyond which minimal changes occur in the mechanical properties of typical insole materials. This implies that if an insole maintains an adequate ability to absorb impacts up to 100 km of wear, then this ability will be maintained with continued use.
The mechanical test results of the present study indicate that insole A has the lowest stiffness of the test insoles, both when new and when mechanically degraded (Fig. 2). The remaining insole samples each have similar stiffness values. Despite the much lower stiffness for insole A, this insole does not provide the most impact absorption when evaluated using the drop test procedure (Fig. 3). It is possible that this material “bottoms out” when subjected to the load applied by the impact test. Alternatively, the viscous properties of the test materials may have contributed to the different ranking of materials in the two tests. As viscoelastic materials, the properties of the insoles will have been influenced by the rate at which load is applied. The Instron apparatus quantified stiffness using a rate of loading comparable to that applied during heel strike in running. The impact test involved a mass impacting directly with the material under the influence of gravity, resulting in a relatively short time to peak force, and thus a greater rate of loading, for this test procedure. The viscoelastic nature of the test insoles could therefore have resulted in insole A having a relatively low resistance to deformation in the stiffness test using the Instron apparatus, but not necessarily providing the most impact absorption in the impact test.
It is evident that the use of stiffness alone is not sufficient for the identification of insole materials most suitable for use in reducing impact loads during running. Because insole C causes the greatest reduction in peak deceleration in the drop test (greater than 50%) and this insole is also the only test insole to reduce impact loading during running compared with the control condition, it may be that the drop test procedure is more suitable than the measure of stiffness for the identification of relative differences in impact-absorbing ability of insoles when used inside a military boot. Further detailed study of stiffness and damping properties of insole materials under different test conditions is required to investigate this suggestion.
The measurement of ground reaction force to quantify impact-absorbing ability of the test insoles during running has revealed that only insole C reduces peak loading during heel impact, as indicated by the later peak impact force and reduced peak loading rate (Figs. 5 and 6). The hypothesis that all new samples of the test insoles would reduce peak impact loading is therefore rejected. This hypothesis was based on the findings of a previous study in which a range of insoles were tested, and all were found to reduce peak heel pressures in running when placed in military footwear (24). Because the insoles in the present study were similar to those of this previous study, it is suggested that in-shoe pressure is more sensitive to changes in cushioning of the shoe-surface interface than ground reaction force variables. It is therefore recommended that assessment of the impact-absorbing ability of insole samples should include the measurement of in-shoe pressures, allowing the detection of changes in distribution of load in addition to resultant force data. The finding that the magnitude of peak impact force was similar for all insole conditions is consistent with the majority of studies where peak impact force has been measured for variations in shoe and insole materials (5,17). Previous suggestions that the rate of loading is more sensitive than the magnitude of impact force to changes in the cushioning provided by the shoe-surface interface (5,15) are therefore supported by the observed reduction in peak rate of loading for insole C in the present study.
The reduction in rate of loading of impact force during running when using insole C indicates that this insole cushions impact loading when placed in a military boot (Fig. 6). However, the clinical (practical) significance of the observed reduction in loading rate for insole C is unclear. Some authors have associated rate of loading of impact force with injury occurrence. In particular, Radin et al. (20) reported that individuals presenting with anterior knee pain exhibited a greater rate of loading of impact force than uninjured subjects. In addition, Hreljac et al. (14) detected greater peak impact force and rate of loading for runners with an injury history compared with those reporting no previous injuries. In contrast, Stephanyshyn et al. (22) reported no difference in impact variables between runners developing an injury and uninjured individuals. Although further research is required to confirm associations with injury, it is suggested that changes in loading rate may highlight differences in injury susceptibility. In addition to insole C resulting in a lower peak loading rate in running compared with the boot only condition, the peak loading rate was lower when using most of the insoles compared to the degraded standard issue insole (insole D) (Fig. 6). It is therefore speculated that degraded samples of insole D may have a detrimental influence on the loading of the lower extremity during running by causing a greater rate of loading than most other insole conditions and possibly increasing the likelihood of injury.
The choice to use a controlled mechanical procedure for preparation of degraded insoles in preference to actual wear by recruits was made based on difficulties in controlling insole loading during wear and the additional time involved in repeatedly testing insole samples at different stages of wear. However, the comparison of field tested (human degraded) samples with mechanically degraded insoles highlights different levels of degradation experienced by insole materials when mechanically loaded compared with during actual use (Figs. 3 and 8). For all insole types, the impact-absorbing ability indicated by the drop test was markedly lower after mechanical degradation than after field wear. Also, with the exception of insole A, stiffness increases compared with new samples were greater after mechanical degradation.
The observation that, in general, the mechanical degradation had a greater influence on the material properties of the insole samples than field wear may be the result of the metal impacting head used in the mechanical degradation being harder than the human heel, as previously suggested by Baumann et al. (1). An alternative explanation for the smaller changes in insole properties when worn in the field may be the relatively soft grass surface on which much of the field distance was covered. The choice of loading conditions for the mechanical degradation was made using typical loading conditions measured in a previous study of running in military boots where an asphalt running surface was used (24). It is possible that the loading experienced by the insoles in the field was lower because of the softer running surface, although studies of surface effects on impact forces and pressures have rarely detected differences between surfaces (8,9). The difference may also reflect the recovery intervals between periods of loading when insoles are worn in the field, compared with the uninterrupted application of all 40,000 impacts in the mechanical degradation process. Because it has previously been demonstrated that typical insole materials recover some of their compression set after a period of recovery between periods of loading (11), the temporal differences in the loading applied during mechanical degradation and field use may have contributed to the smaller changes in stiffness observed for the field-tested insoles.
Because the mechanical degradation influenced insole materials to a greater extent than field wear, and as the mechanically degraded samples of insole C have been found to reduce impact loads during running, it is suggested that the impact-absorbing ability of this insole will be maintained during actual wear. However, there is likely to come a time during actual use when the material will break down owing to factors such as moisture and shear loading, as indicated in previous studies of insole materials (18,19). The markedly lower stiffness of insole C samples when tested after human degradation, compared with the stiffness of both new and mechanically degraded insoles, suggests that such factors are influential. The discrepancy in stiffness values highlights a large change in the mechanical behavior of this insole type after actual wear. It is possible that the conditions during wear may have caused the open cell foam of this insole to break down. Alternatively, the EVA structure supporting the foam material may have experienced a reduced stiffness or may have become less well attached to the shock-absorbing material. The positive effects of lateral constraint of insole materials by bonding them to other substances in the production of insoles have been previously highlighted (4). Insole C was the only test insole to be constructed by bonding to a stiffer material, possibly contributing to the superior impact-absorbing ability exhibited by this type of insole. Although not visually apparent for the tested insoles, it is possible that a loss of bonding between the insole material and the EVA footbed may have contributed to the reduced stiffness of this insole type after use. Biomechanical testing of insoles that have been worn in the field under carefully monitored conditions is therefore suggested before recommending the routine use of insole C.
The measurement of joint flexion in the present study has highlighted a significantly lower peak ankle dorsiflexion for insole C compared with the boot only and standard issue insole conditions. This may be the result of less ankle flexion being required to contribute to cushioning of ground impact for these running conditions. Alternatively, the reduced dorsiflexion may be the result of the heel lift provided by this shaped insole. Based on literature evidence demonstrating an increased energy cost associated with joint flexion (12), it is speculated that this reduction in peak ankle dorsiflexion may reduce the energy cost of running by reducing the work done by the recruit in the cushioning of impact. Although the reduction in ankle dorsiflexion for the insole C condition is relatively small (1.4°), this reduction may also reduce the likelihood of injury occurrence, specifically Achilles tendon injury. This suggestion is supported by evidence that a reduction in ankle dorsiflexion of 1.4° results in a reduced triceps-surae muscle-tendon complex length in the region of 0.5% (7). Because the Achilles tendon experiences strain that is close to its failure point during each running step (in the region of 5% of resting length), any reduction in the peak strain is likely to be beneficial in reducing the likelihood of tendon damage. The observation that lower-extremity joint angles immediately before ground impact are not influenced by the use of insoles does not support the observation of de Wit et al. (6) that an active adaptation occurs prior to ground contact in response to changes in artificial cushioning. However, the conditions used in the present study are not as extreme as the barefoot versus shod running conditions studied by these authors.
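The reported muscle-tendon length change can be sanity-checked with simple geometry. The Achilles moment arm and muscle-tendon unit length below are assumed typical anatomical values, not figures taken from the paper; the point is only that a ~1.4° angle change plausibly produces a length change of a few tenths of a percent.

```python
import math

# Assumed typical anatomy (hypothetical values for illustration only).
moment_arm_m = 0.05    # Achilles tendon moment arm about the ankle (~5 cm)
mtu_length_m = 0.40    # triceps-surae muscle-tendon unit length (~40 cm)

# Small-angle approximation: length change ~= moment arm * angle change (rad).
delta_theta = math.radians(1.4)
delta_length = moment_arm_m * delta_theta            # ~1.2 mm
percent_change = 100 * delta_length / mtu_length_m   # a few tenths of a percent
print(f"MTU length change ~ {percent_change:.2f}%")
```

This rough estimate is of the same order as the ~0.5% reduction cited from reference (7), supporting the plausibility of the reported effect.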
The combined use of mechanical, biomechanical, and field testing has facilitated the identification of an insole type suitable for use with military footwear. The results of any single test used in this study may not have been sufficient to justify the selection of a specific insole. However, combining results across the different investigations highlights that insole C is consistently ranked highly. This insole ranks highest in terms of mechanical impact-absorbing ability (before and after degradation) and impact force reduction during running. It is therefore concluded that, although mechanical tests indicate that all insoles have potential for reducing impact loads during running, and for maintaining this ability over time, insole C is the only insole to reduce peak loading during running. It is suggested that this insole is therefore the most likely to influence injury occurrence.
The authors would like to acknowledge the help of Dr. A. C. Collop and Mr. T. Singleton and the technical staff in the School of Civil Engineering, University of Nottingham. We would also like to thank the Commandant and Medical Staff at Commando Training Center, Royal Marines, and Dr. R. J. Pethybridge from the Institute of Naval Medicine for his assistance with the statistical analysis.
This work has been carried out with the support of the Ministry of Defence (Navy).
The results of this study do not constitute endorsement of any product by the authors or by ACSM.
1. Baumann, W., B. Krabbe, and P. Galbierz. Life characteristics of running shoes and shoe material. Proceedings of the Second Symposium on Footwear Biomechanics, Cologne, Germany, 1995, pp. 38–39.
2. Campbell, G. J., M. McLure, and E. N. Newell. Compressive behaviour after simulated service conditions of some foam materials intended as orthotic shoe insoles. J. Rehabil. 21: 57–65, 1984.
3. Clarke, T. E., E. C. Frederick, and L. B. Cooper. Biomechanical measurement of running shoe cushioning properties. In: Biomechanical Aspects of Sport Shoes and Playing Surfaces, B. M. Nigg and B. A. Kerr (Eds.). Calgary: University of Calgary, 1983, pp. 25–33.
4. Cinats, J., D. C. Reid, and J. B. Haddow. A biomechanical evaluation of Sorbothane. Clin. Orthop. Relat. Res. 222: 281–288, 1987.
5. de Wit, B., D. de Clercq, and M. Lenoir. The effect of varying midsole hardness on impact forces and foot motion during running. J. Appl. Biomech. 11: 395–405, 1995.
6. de Wit, B., D. de Clercq, and P. Aerts. Biomechanical analysis of the stance phase during barefoot and shod running. J. Biomech. 33: 269–278, 2000.
7. Dixon, S. J., and D. G. Kerwin. The influence of heel lift manipulation on sagittal plane kinematics in running. J. Appl. Biomech. 15: 139–151, 1999.
8. Dixon, S. J., M. E. Batt, and A. C. Collop. Artificial playing surfaces: a review of medical, engineering and biomechanical aspects. Int. J. Sports Med. 20: 1–10, 1999.
9. Dixon, S. J., A. C. Collop, and M. E. Batt. The influence of changes in running surface on ground reaction forces and lower extremity kinematics in shod running. Med. Sci. Sports Exerc. 32: 1919–1926, 2000.
10. Evans, G. W. L. Stress fractures at Commando Training Centre Royal Marines, Lympstone: a retrospective survey. J. R. Navy Med. Serv. 68: 77–81, 1982.
11. Foto, J. G., and J. A. Birke. Using bench top methods to evaluate dual-density materials used in therapeutic footwear. In: Proceedings of the Fourth Symposium on Footwear Biomechanics, Canmore, Canada, E. M. Hennig and D. J. Stephanyshyn (Eds.). 1999, pp. 40–41.
12. Frederick, E. C., T. E. Clarke, J. L. Larsen, and L. B. Cooper. The effects of shoe cushioning on oxygen demands of running. In: Biomechanical Aspects of Sport Shoes and Playing Surfaces, B. M. Nigg and B. A. Kerr (Eds.). Calgary: University of Calgary, 1983, pp. 107–114.
13. Gardner, L. I., J. E. Dziados, B. H. Jones, et al. Prevention of lower extremity fractures: a controlled trial of a shock absorbent insole. Am. J. Public Health 78: 1563–1567, 1988.
14. Hreljac, A., R. N. Marshall, and P. A. Hume. Evaluation of lower extremity overuse injury potential in runners. Med. Sci. Sports Exerc. 32: 1635–1641, 2000.
15. Lees, A., and P. J. Mccullagh. Preliminary investigation into the shock absorbency of running shoes and shoe inserts. J. Hum. Mov. Stud. 10: 95–106, 1984.
16. Milgrom, C., M. Giladi, and H. Kashtan. A prospective study of the effect of a shock-absorbing orthotic device on the incidence of stress fractures in military recruits. Foot Ankle 6: 101–104, 1985.
17. Nigg, B. M., W. Herzog, and L. J. Read. Effect of viscoelastic shoe insoles on vertical impact forces in heel-toe running. Am. J. Sports Med. 16: 70–78, 1988.
18. Pratt, D. J. Medium term comparison of shock attenuating insoles using a spectral analysis technique. J. Biomed. Eng. 10: 426–429, 1988.
19. Pratt, D. J. Long term comparison of some shock absorbing insoles. Prosthet. Orthot. Int. 14: 59–62, 1990.
20. Radin, E., K. H. Yang, C. Riegger, V. L. Kish, and J. O’Conner. Relationship between lower limb dynamics and knee joint pain. J. Orthop. Res. 9: 398–405, 1991.
21. Scheffé, H. Analysis of Variance. Chichester: Wiley, 1959, pp. 55–89.
22. Stephanyshyn, D. J., P. Stergiou, V. M. Y. Lun, and W. H. Meeuwisse. Dynamic variables and injuries in running. In: Proceedings of the 5th Symposium on Footwear Biomechanics, Zurich, Switzerland, E. Hennig and A. Stacoff (Eds.), 2001, pp. 74–75.
23. Whittle, M. W. The Use of Viscoelastic Materials in Shoes and Insoles: A Review. Magister Corporation, 1996, pp. 1–7.
24. Windle, C. M., S. M. Gregory, and S. J. Dixon. The shock attenuation characteristics of four different insoles when worn in a military boot during running and marching. Gait Posture 9: 31–37, 1999.
25. World Medical Association Declaration of Helsinki. Recommendations guiding physicians in biomedical research involving human subjects. Adopted by the 18th World Medical Assembly, Helsinki, Finland, 1964, and amended by the 52nd World Medical Association General Assembly, Edinburgh, October 2000, pp. 1–5.
Keywords: GROUND REACTION FORCE; SAGITTAL PLANE KINEMATICS; KINEMATIC ADAPTATION; LOWER-EXTREMITY STIFFNESS
©2003 The American College of Sports Medicine