So let's keep it real. It's 2023 and girls feel like they should be equal to men. I mean, they always felt like that, but do y'all think that girls should be kissing feet too? Like, I know guys be kissing feet and all that. Do y'all think girls should now be kissing feet? Like, should that be the new norm, or are they already kissing feet and we just don't know about it? So let me know what y'all think. Get in the comments.