Нажимая "Хорошо", вы соглашаетесь на сохранение файлов cookie на вашем устройстве для улучшения навигации по сайту, анализа использования сайта и помощи в наших маркетинговых усилиях.
Do you think series get worse as they bring out new seasons? For example, Prison Break. It started so good. Season one was unreal. Season two was somewhat alright. And then the rest of the seasons just went downhill. It happened with Peaky Blinders too, I remember. Seasons five and six just weren't as good as the seasons before. It just happens a lot, and I don't know, it ruins a lot of good shows. Do you like it or not?
Seems to be the case for most TV series, but you would hope not, you know. I feel like when they set the tone at the beginning, and the start of the series is so good, it usually does get worse, you know. It's a natural thing that will happen.
Do you know what, it depends. Sometimes series get worse as they bring out new seasons, but then some series get better. For some shows, when they put out the new season, it actually got better. But I think that's the thing with it: sometimes it's a bit shit, but then in the end it depends on the show.
One thing to consider, if a series gets worse, is whether they changed directors or writers. It might not be the same writers or the same director as at the beginning, so the show might be directed differently or the script might come out differently. For example, I think Game of Thrones came out very strong. HBO does this, they make amazing series, but they shit the bed on the last episode. Don't know why.