As I write this, the sun is shining and it’s almost 50 degrees out. After a winter in which I’ve routinely run at a stiff crawl in single-digit weather, dodging ice, snow banks, and plow trucks, it feels so balmy that I want to celebrate the spring weather with physical activity. Like many other runners, I find myself thinking about races, speed work, training plans, and ambitious summer goals as soon as we hit March.
Now, the obvious risk of suddenly throwing yourself back into ramped-up training is being sidelined with an immediate injury. Many runners heed the 10% rule when increasing mileage after downtime in training. That is, overall weekly running mileage shouldn’t be increased by more than 10% each week, and optimally every 4th week should be a cut-back week in which no further increases in mileage are attempted and mileage may even be reduced. So, this might look like: Week 1 (20 miles), Week 2 (22 miles), Week 3 (24 miles), Week 4 (20 miles). There are many variations on the plan, but the premise of the 10% rule remains constant: a gradual build reduces the risk of injury.
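For readers who like to see the arithmetic, here is a minimal sketch (my own illustration, not from any training program) of one common variation of the rule: mileage grows 10% per week, and every 4th week falls back to the start of that cycle. Note that strict 10% compounding gives 24.2 miles in week 3 rather than the rounded 24 in the example above.

```python
def ten_percent_plan(start_miles: float, weeks: int) -> list[float]:
    """Weekly mileage under the 10% rule: +10% per week, with every
    4th week cut back to the mileage that began the 4-week cycle."""
    plan, current = [], start_miles
    for week in range(1, weeks + 1):
        if week % 4 == 0:
            # cut-back week: undo the three 10% increases of this cycle
            plan.append(round(current / 1.1 ** 3, 1))
        else:
            plan.append(round(current, 1))
            current *= 1.1
    return plan

print(ten_percent_plan(20, 4))  # -> [20.0, 22.0, 24.2, 20.0]
```

How the cut-back week resumes (repeat the cycle, or pick up where you left off) is a judgment call that varies between plans; this sketch only shows one 4-week block.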
But how valid is this approach? To answer that question, I took a quick run through PubMed and found a couple of recent studies on the topic. The first looked at injury occurrence in 873 runners who had increased their weekly mileage by 10% or less, by 10–30%, or by more than 30%. Interestingly, runners who increased their weekly running mileage by more than 30% had a significantly higher incidence of distance-related injuries, including patellofemoral pain, iliotibial band syndrome, medial tibial stress syndrome, patellar tendinopathy, gluteus medius injury, greater trochanteric bursitis, and injury to the tensor fasciae latae. However, increasing mileage by more than 10% had no impact on other overuse, pace-related, or traumatic injuries, such as Achilles tendinopathy, plantar fasciitis, tibial stress fracture, and hamstring injuries. This is displayed on the included graph: the risk of distance injuries across cumulative mileage (blue line) is greater in the group that increased mileage by more than 30% (Graph B) than in the group that increased it by less than 10% (Graph A). So, this observational study suggests that the 10% rule may be effective for preventing some, but certainly not all, running injuries.
Even more persuasive, though, is a randomized controlled trial of 532 novice runners assigned to either a standard 8-week training program (control group) or an adapted, graded, 13-week training program (intervention group) as they prepared for a 4-mile running race. Importantly, the graded 13-week training program was based on the 10% rule, to investigate whether this slow incremental increase in mileage prevented running injuries. And…it did not. The injury rate was essentially identical in each group: incidence was 20.8% in the intervention group and 20.3% in the regular control training group. The included survival curve shows the decreasing (and similar) numbers of runners surviving injury-free in each group as training duration (running exposure) increased across the study. The authors concluded that when “preparing to participate in a 4-mile run, it does not matter how you get there (either fast or slow)—the risk of sustaining an RRI [Running Related Injury] is the same.”
So is the 10% rule valid? Should it be followed? It’s probably a good general rule for avoiding the distance-associated overuse injuries that can be exacerbated by rapidly increasing mileage. But it’s not foolproof, it certainly doesn’t apply to everyone, and unfortunately there’s no guarantee that following it will keep you injury-free once the days get longer and the sun comes out.