Most wellness models in our Western culture teach that the body is the foundation for everything in life, and that without a healthy body everything else suffers. I believe this is backwards. Your body isn't the foundation of your health. Your body is the physical manifestation of your life experiences. When your life is out of balance, your mind gets stressed. And when your mind is stressed, your body suffers. The good news is that you can make changes in your life, at any time, that positively affect your health.
Unfortunately, our medical system doesn't teach doctors, or allow them the time, to counsel their patients on the things that ultimately manifest in the physical body. So we need to take on this responsibility ourselves. How do we do that?
Ask yourself what might be the root of your medical condition. Think beyond the conventional answers like "I need to lose weight" or "I'm depressed." These may be true, but they're symptoms. You're looking for the underlying cause. Dig deeper: Is your marriage on the rocks? Do you hate your job? Are you lonely?
Ask yourself "What does my body need so it can heal?" Yes, maybe you need antibiotics or some professional counselling. But maybe you need to finish that novel, hire a nanny, forgive an old injustice, or end a relationship.
This is tough stuff.
Go easy on yourself.
Reiki can help.