It's always women who get brainwashed, isn't it? Whether by patriarchal gender norms or by feminism, I'm constantly being told that women don't know what they want, have no agency, and are very impressionable. I think it's a Christian-culture thing that goes back to the Sin of Eve. https://twitter.com/pookietooth/status/956933568743161856
I said Christianity blamed women for the Sin of Eve. Men are understood to have different sins, though.
-
It blames women for all problems related to sex and reproduction, though. Women are held responsible for unwanted pregnancy, sexual assault, rape, pain in childbirth, etc. Men are forgiven; women are not.
-
Why are we arguing about this? I blamed Christianity for demeaning women, and you feel the need to explain to me how Christianity is bad for women?
-
No, you seemed to be implying that Christianity is the reason I would think that women are forced to take on feminine roles -- that Christianity makes me think women are victims -- when it's actually Christianity that tends to force women into feminine roles in the first place.
-
What? Oh, go away. Too silly. Bored now.