That’s right! I said it! If you think about it, doctors don’t heal your body. Your body HEALS ITSELF. Doctors help your body do its job (healing) by treating it through their particular form of healthcare, whether that be chiropractic adjustments, medicine, acupuncture, nutrition, or even surgery. Think about it! Treatment is basically geared toward changing your body’s internal environment (physically, chemically, or otherwise) to remove the interference to healing.
So what does that mean? It means that YOU hold the power WITHIN you to heal. Sometimes you need help, but isn’t it neat to think we all have that innate ability to heal?