Looking for a term/articles
March 7, 2011 8:47 AM
Is there a term for the idea that women are (at least partially) the enforcers of patriarchal gender norms and practices?
posted by kittenmarlowe to Society & Culture (13 answers total) 6 users marked this as a favorite
I recently had a conversation with a Sierra Leonean professor I know about the practice of female circumcision, where he commented that, in his experience, it was the African women who were most likely to defend the practice. Around the same time, my own students brought up in classroom discussion that they believed women were much more likely than men to police other women's bodies and gender presentations.
Whether or not these claims are true, they made me curious whether there is scholarship on this concept. I've seen vague mentions in feminist sources of the role women play in promoting patriarchal assumptions, but I haven't been able to find anything specific. Does anyone know of authors, books, or articles on this subject, preferably academic?