Quite self-explanatory from the title.
I watched the show a while back, and quite a few of the scenes stuck with me (maybe not in the way the show intended to convey them). It was the first time I was really inspired to become a doctor, and I thought it showed quite a few ways doctors can genuinely impact people's lives.
Obviously, the only problem is that it's a TV show, not even a documentary. Would medical schools look down on me for mentioning it?