When movies depict a world of traditional gender roles (whether historical or cultural) and the women then rise up and Do Feminism, I just don't buy it. I was raised in a super traditional gender-roles world, and I didn't see a single woman around me Doing Feminism.
I definitely believe this exists, and it makes sense to me that it's a generational thing. I also seem to experience less sexism than my female friends do: less catcalling, harassment, hate mail, etc. I'm not sure why, but I suspect I "give off a vibe" or something.