Originally Posted by Mixdguy17
No, traditional feminism is what made women be acknowledged as human beings, gave them rights, gave them the ability to be as free as men, and stopped them from being treated as second-class citizens. For fuck's sake, women weren't even allowed to sing before! Can you fucking believe that? They used to castrate kids (known as castrati, singular castrato) who were good singers, since women at the time weren't allowed to sing for the simple fact that they were women. Just fucking imagine: kids getting drugged and then having their balls chopped off, against their will, so they couldn't develop testosterone and could keep reaching notes that women normally sing. Many of those kids didn't even survive the procedure. Is that what you want to go back to?
So no, feminism is important, because before it, the way women were treated was inhuman, and the worst part is that it still happens in many parts of the world, like the Middle East, where they aren't even allowed to get a fucking education, and that's not even close to the worst of what they have to endure for the simple fact of being born a woman. Like I said, what needs to be eradicated are its extremisms, or all extremism in general, just like what your own ideology sometimes seems to come off as; that is what the real cancer is.