Okay, so do you guys believe in holistic healthcare, like using herbs, fruits, and other natural remedies to heal you? Or do you believe in pharmaceutical companies that give you medications to suppress any quote-unquote disease you may have? What are your thoughts on this?