Do you guys think movies and TV shows are becoming quote-unquote woke? I just saw some discourse on Twitter, and I feel like a lot of people are misusing the word "woke" in a negative way, because woke just means you're aware of racism, sexism, and other inequalities like that. I don't think it's a bad thing at all to be awake to those types of inequalities. But I do sometimes see people getting annoyed that movies are more woke now. I don't know. I don't mind it when it's done in subtle ways. What I don't like is when it's very obvious, in-your-face, cringey wokeness, you know? But when it's done well, I don't mind it.