Hey all,
As the title suggests, I'm looking for a new TV series to watch. I loved GoT, even though it took my friend a long time to convince me to watch it. However, now that season 4 is over, I'm looking for a similarly addictive series. So far, The Walking Dead and Breaking Bad seem to be the top contenders. Which one do you think I should start with (I'll inevitably end up watching both)? What does each one have to offer?
Let me know what you think about both of them, and any suggestions are appreciated!