Okay, so do you guys believe in holistic healthcare, like herbs and fruits and such to heal you? Or do you believe in pharmaceutical companies who give you medications to suppress any quote-unquote disease you may have? What are your thoughts on this?
I am 100% into holistic healthcare. I think that conventional medicine is there for a reason, but it's more reactive than proactive. If you live a healthy lifestyle, you don't need to take as much conventional medicine, which, again, doesn't prevent disease, it just treats it.
Holistic health goes beyond herbs and fruits, though. It's more of an entire healing between your emotional self, your physical self, your spiritual self, and everything in between.
So I prefer holistic health, but there's also a way to incorporate holistic health into your evidence-based practice as well. As a physical therapist, I believe we focus on holistic health as well as evidence-based care.
Fun fact: 45% of the pharmaceutical drugs on the market today are derived from plants, and 20% of those are derived from plants found in the rainforest.
For sure, holistic health is definitely easier on your body. The other way, you put it under a lot of toxins that may also affect other organs in your body.
I definitely prefer holistic healthcare, unless it's something really detrimental, like obviously heart issues, things like that. But I do believe in herbal remedies.