
When a doctor recommends a treatment, they are required to inform the patient of the risks that treatment involves. Legally, this concept is called informed consent: before a patient can consent to medical treatment, they must first be properly informed of its risks. You are entitled to understand. We all …