I respect your opinion, and I agree with your premise that whoever is best suited for the job should get it. But believe me when I say that a lot of women today don't understand, or never knew, the founding principles of feminism. It has truly become more of a fight for supremacy these days. I used to call myself a feminist because I believed women should have equal rights and opportunities, but from what I see nowadays, that's no longer what feminism is about.
Many women just want to oppress men; with those kinds of women it's not a fight for equality, it's a fight for supremacy. It pains me when I see men get positions over women simply because they're men, especially when the women are better suited for the job. That is what women should be pushing back against. Instead, these days they seem more concerned with bad-mouthing men and painting us as "scum," as they say.
I mean, more men than women are victims of violent crime, more men than women go to war, and more men than women die in wars; that's a fact. For feminism to actually mean something, women have to give men the same respect and equality they are asking for themselves. It's supposed to be about equality, but a lot of women have turned it into a power tussle for supremacy.