Healthy Living

These Doctors and Nurses Reveal What Grey's Anatomy Gets Wrong About Medicine

Don't be fooled, though: there are a few things Grey's Anatomy gets right.

For all its melodrama, Grey’s Anatomy is intended to amuse and entertain, not to educate viewers about the realities of the medical profession. The show is fiction, so its portrayals of doctors, nurses, and patients should be taken with a hefty grain of salt, and it should by no means be read as a faithful picture of what goes on in a real hospital.

Nevertheless, although the show gets a lot wrong, some elements are surprisingly realistic. Despite its glaring inaccuracies, it does a decent job of depicting the anxiety, uncertainty, and overwhelming pressure of working as a new doctor.

Photo source: ABC