Is this still the case these days? I don't even remember which season I stopped watching. It was the one where Glenn died an extremely gruesome death, which kind of put me off. The show just got so uneventful after that, and overall it had too much gore at that point for my liking. The zombies were hardly a threat either. There was some other town with a lion too. It felt like the show was repeating itself.
Around that time I read that the show was never going to explore the virus, a cure, or its origins. Same with Fear the Walking Dead, so I also tuned out after its first season. I get it, The Walking Dead was never meant to focus on that, it's about the people, not the zombies, but what is the end goal of it all? And what was the point of Fear the Walking Dead if it ended up being the same plots just on the west coast (from where I left off)? I thought the first episode was so cool and really brought back the tension... and then that was it.
Submitted April 28, 2019 at 08:53PM by LustyGurl http://bit.ly/2GJBKgU