Is Health Care a Right?
Health care is fundamental to living a full life, and access to adequate health care should be recognized as a human right. Unfortunately, many countries have failed to recognize this right; in the US, however, there is currently an effort underway to expand affordable and accessible health care options for its…