In an attempt to overcome the problem of comparing different fields of competitors in assessing the effect of climate on running performance, McCann and Adams devised an ingenious study. In their work, each runner served as his own control. The runner's performance in the NCAA Championships (where the top six in each race are awarded All-American honors) was compared with his personal-best performance for that season. Since both performances were run on a track over an identical distance and were assumed to be maximal efforts, the major differences in performance could be attributed to race conditions. The differences between NCAA Championship and season personal-best times were then averaged for the All-Americans and expressed as a percentage change. Here are the results for the 10K race, over a 7-year span:

[Chart: 10K results over a 7-year span]
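The study's metric is simple enough to sketch in code. A minimal illustration of the percentage-change calculation, with hypothetical 10K times in seconds (the helper name pct_change and all values are invented for illustration, not the study's data):

```python
# Sketch of the study's metric: each runner's NCAA Championship time
# is compared with his own season personal best, expressed as a
# percentage change, and the per-runner changes are averaged.

def pct_change(championship_s: float, season_best_s: float) -> float:
    """Percentage change of the championship time vs. the season best.
    Positive values mean the championship race was slower."""
    return 100.0 * (championship_s - season_best_s) / season_best_s

# Hypothetical 10K times in seconds (championship, season best).
runners = [
    (1752.0, 1739.0),
    (1761.5, 1748.2),
    (1745.3, 1744.0),
]

changes = [pct_change(c, b) for c, b in runners]
mean_change = sum(changes) / len(changes)
print(f"Mean performance change: {mean_change:+.2f}%")
```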
This study supports the notion that changes in heat stress, in this case measured as Wet Bulb Globe Temperature (WBGT, a composite index of temperature, relative humidity, wind, and radiant heat), produce changes in athletic performance. Furthermore, these results show that the effects of heat stress are dose-related: even small temperature differences produce measurable change. Increasing the WBGT from 65°F (the Marine Corps break point for increased exercise-related heat injury) to 82°F (the American College of Sports Medicine "Event Delay Threshold") produces a substantial 48-second increase in 10K times. What is the optimal temperature for long-distance running performance? The climatic-chamber work of Galloway and Maughan suggests that 51°F is ideal. McCann and Adams' data suggest that optimal performance occurs at WBGTs between 61 and 66°F. However, their approach did not examine the possibility that performance would have improved even further at lower temperatures.
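To make the dose-response concrete: 48 seconds over a 17°F span works out to roughly 2.8 seconds per degree of WBGT. A minimal sketch, using the standard outdoor WBGT weighting (70% natural wet bulb, 20% black globe, 10% dry bulb) and assuming, purely for illustration, a linear response between the article's two reported points; the function names and input temperatures are invented:

```python
# Sketch: outdoor WBGT from its component temperatures, plus a linear
# estimate of the 10K slowdown implied by the article's two data points
# (65°F and 82°F WBGT, 48 s apart). The linear model is an assumption;
# the article only reports the endpoints.

def wbgt_outdoor(t_wet_bulb_f: float, t_globe_f: float, t_dry_bulb_f: float) -> float:
    """Standard outdoor WBGT weighting: natural wet bulb 70%,
    black globe 20%, dry bulb 10%."""
    return 0.7 * t_wet_bulb_f + 0.2 * t_globe_f + 0.1 * t_dry_bulb_f

def est_10k_slowdown_s(wbgt_f: float) -> float:
    """Assumed linear interpolation between the reported endpoints:
    0 s of slowdown at 65°F WBGT, 48 s at 82°F WBGT."""
    slope_s_per_f = 48.0 / (82.0 - 65.0)   # ~2.8 s per degree F
    return max(0.0, (wbgt_f - 65.0) * slope_s_per_f)

wbgt = wbgt_outdoor(t_wet_bulb_f=70.0, t_globe_f=95.0, t_dry_bulb_f=85.0)
print(f"WBGT: {wbgt:.1f} F, estimated 10K slowdown: {est_10k_slowdown_s(wbgt):.0f} s")
```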