Evidence-Based Practice (EBP) is a term thrown around a lot these days. I hear so many different versions of it that I wonder how many people actually understand and apply the concept.
The basic idea behind EBP is to make sure that everything you’re doing is purposeful, efficient, safe and effective. Its core principles are that all practical decisions should 1) be based on research studies, and 2) rely on studies that are selected and interpreted according to specific norms. Makes perfect sense, huh?
Unfortunately, a lot of trainers don’t think this way and don’t do their homework to find out whether their programming decisions are supported by evidence.
It’s scary to think that professions like medicine, psychology, psychiatry and rehabilitation have all had periods where practice was based on rather loose bodies of knowledge, or hunches. Some of that knowledge was lore, myth or theory drawn from the personal experiences of practitioners. Much of this misinformation had no valid scientific support, so the practices built on it could never be validated.
This has often left the door open to quackery perpetrated by people who have little or no training in the discipline, but who desperately want to convey the impression that they do, whether for profit, ego or other motives.
Fortunately, the scientific method has increasingly become recognized as a way to validate methods in many medical disciplines. These professions needed a way to expose quack practitioners, not only to preserve the integrity of the field, but also to protect the public. Even when quackery wasn’t present, these professions recognized the value of identifying what actually works and what doesn’t, so methods could be improved and properly promoted.
Our field, however, has not gotten to this point yet. There is still PLENTY of quackery out there, and it is generally accepted by the public because, unlike medicine, fitness is often not taken as seriously. There is also a lot of grey area in the world of fitness or strength and conditioning. It’s very difficult to “prove” what works because there are far too many factors that go into getting results. It’s even possible to get great results from a terrible program, so it’s really difficult to expose quackery in our profession.
In my opinion, our profession needs to move in this direction if we’re ever going to get the respect we deserve. That means that certain methods (and people) will eventually need to be exposed if the rest of us are to be taken seriously.
To move in this direction, I like to divide things up into three categories:
1. Things we KNOW work and have scientific evidence to support it (things like basic strength training, plyometrics, progressive conditioning, mechanics work, etc.). There is plenty of research available on strength & conditioning topics, so we need to continually be evaluating our practices against it.
2. Things we THINK work, but the science is a little unclear (things like foam rolling, activation, correctives, new periodization strategies, etc.). I definitely don’t think anything in this category would be considered quackery, but we need to be careful what we place here. This isn’t a place for something we WISH would work. This category is for methods that are safe and productive, but science still hasn’t figured out exactly how to apply them for maximum benefit.
3. Things we know DON’T WORK or are unsafe (training power while fatigued, distance running for speed development, power exercises over 10 reps, toning, spot reduction, no progression in strength training, poor technique, etc.)
Obviously, there is a TON of grey area in there, but you get the idea.
I try to do as much as possible from category 1 (maybe 75-85%), a decent amount from category 2 (maybe 15-25%) and steer clear of category 3.
I don’t think that EVERYTHING we do has to have a specific journal article supporting it, though, because when you start talking about complete programming, no scientific study is going to replicate your entire program. It’s impossible to control all the variables. That’s where long-term experience comes into play.
For example, we haven’t been able to conduct scientific studies on all parts of our program, but years of doing it has shown that it produces results. That’s evidence. It doesn’t necessarily mean that any single thing in the program is responsible for the results, but the combination of everything WORKS.
We do pre- and post-testing to evaluate effectiveness, but we can’t control for things like motivation, work ethic, instruction, attendance, and program modification. Each program is a little different because the needs of every athlete are a little different. Sometimes the differences are small, but sometimes they are massive.
It takes a long time and a lot of people, however, to gather any kind of experiential evidence that is worth relying on. So, you can’t just train one person, get results and call that evidence-based. There is a LITTLE evidence there, but until you replicate it over and over, it’s still worth questioning.
I don’t know how many people need to be in the “test” group, or exactly what qualifies something as evidence, though. There is a ton of grey area here and I don’t have the answer.
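Just to make that concrete, here’s a hypothetical illustration of what the formal version of this question looks like. Researchers typically analyze pre/post data with a paired statistical test. The sketch below is in Python with invented 40-yard dash times (none of this is real athlete data from our program); it simply shows the kind of analysis that turns “it seemed to work” into a quantified claim:

```python
# Hypothetical example: paired pre/post comparison.
# The times below are invented for illustration only.
from scipy import stats

# 40-yard dash times (seconds) for the same 8 athletes,
# before and after a training block
pre  = [5.12, 4.98, 5.30, 5.05, 4.87, 5.21, 5.40, 5.02]
post = [5.01, 4.90, 5.18, 4.99, 4.85, 5.07, 5.25, 4.96]

# Paired t-test: did the same athletes change, on average?
t_stat, p_value = stats.ttest_rel(pre, post)

# Negative mean change = faster times = improvement
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"Mean change: {mean_change:+.3f} s")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Even then, a “significant” result only tells you the group changed; it can’t untangle which part of the program (or which uncontrolled variable) produced the change. That’s exactly the grey area I’m talking about.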
I do know that “it worked for me” is not very good evidence. Asking the “big guy” in the gym is not EBP. Also, using something because someone else does it is not EBP. And, basing your decisions on what elite athletes have done is also not really EBP.
I see a lot of really fast or really strong people try to tell others how to get faster and stronger, and it’s often a bunch of crap. Quackery. Genetics override everything, so some people get big and fast doing just about anything (sometimes nothing).
At some point, you have to decide for yourself what you’re willing to accept as evidence. Some people need multiple journal articles. Others need experience. But, lots of people just throw stuff against the wall to see what sticks.
I think we need to move toward EBP. It’s not going to be perfect, and it’s not always going to be comfortable, but we should always be examining what we’re doing with clients. They are (or should be) paying good money and relying on us to provide safe, effective and efficient programming.
Taking these people’s money without some sort of EBP is basically selling snake oil.
I try to share as much EBP information as possible on UltimateStrengthAndConditioning.com, because I think we, as a profession, are better than that. The information is there. It’s up to us to find it, share it….and ultimately use it.
I’d love to hear how others are practicing EBP in their programming.
Excellent blog Jim! I totally agree. If EBP is not being utilized by health, wellness, and strength and conditioning professionals then, in my opinion, the consumer is getting ripped off and is going to suffer in the long run.
Jim – interesting article on “evidence-based” training. I have been following the concept for many years based on my work in clinical exercise and health care. I feel the term is a bit misleading, as “evidence based” in health care usually means “what other practices are doing.” In many cases, that may not have any application to actual improvement in health. In terms of strength training, we have seen a lot of application for clinical improvement over the years, and I discuss it in terms of “outcomes-based” programming, which means that the practitioner is on the hook for improving health outcomes through the exercise prescription. The good news is that most forms of exercise (properly prescribed) have tremendous benefits in terms of improving fitness AND healthcare outcomes (fatigue index, blood pressure, blood glucose, depression scale, etc.), and in a timely manner. I am hoping that as strength training continues to grow (no pun intended) and push into healthcare realms, more positive outcomes will be seen, to the point where physicians use it as one of the mainstays of treatment vs. just surgeries or pharmaceuticals.
Eric, I really like the term “outcomes-based programming.” That might be even better for strength & conditioning than “evidence-based” because it allows us to use methods that we know work through either experience or logical thought. Evidence-based programming often makes coaches feel boxed into only methods that have been examined through a research study, so I like this. Maybe I need to re-write this. Haha.