Please, if this breaks any rules, move it to the correct section, as it has taken me a while to put this all together.
Before I start: I am male, and I'm about to open a controversial debate on my belief that men in the West have fewer rights than women, due to what I see as the plague of Western feminism.
(Before I go into detail: I think there is nothing wrong with feminism and equality in themselves; I explain that below.)
Over the last few years I have seen a strong feminist movement in the West (mainly the UK and USA), which has inspired many women to demand equality.
This includes being able to wear what they want without being judged.
(Skimpy clothing is SCIENTIFICALLY DESIGNED to be sexually appealing, although it only becomes "judging" when the attention is unwanted.)
They want men to stop objectifying and sexualizing women.
(It happens both ways, and the media is to blame for this; I hear "Channing Tatum is hot" just as much as I hear "Emma Watson is hot".)
They claim they are being oppressed by men (extreme feminists say this).
(Not really the case, though; can anyone show a recent example where this has actually happened, solely based on gender?)
And many more points.
But I'd like to focus on what can be seen every day.
A woman punches a man and the man fights back; the man is perceived as the danger, and the woman is seen as being in the right.
A man punches a woman and the woman fights back; bystanders rush to rescue the woman.
But this isn't really evidence, is it?
Take a look at these social experiments. The first video shows a man groping a woman; look at the public's response:
The second video shows what happens when a woman does it to a man:
Here is a clear example of roles being reversed:
(Be sure to check out the last bit.)
Again, a fair counterargument is that I am showing a lack of evidence, but just think and look around. Western feminism has become a joke, and it's come to the point where women have more rights than men.
I feel men cannot speak out about this publicly, for fear of public shaming.
I do think feminism and equality should be a thing, but mainly in Eastern countries or parts of the world where women actually don't have many rights.
I don't really see the point in protesting topless for the right to be topless in public while little girls in the East are being shot for going to school.
Or what about Saudi Arabia? Protest there and see what happens.
I know these are extreme cases of feminism, but they are becoming more frequent, to the point where it's getting ridiculous.
Anita Sarkeesian is another example of this joke.
She stole $100k+ and promised to deliver five videos on how the media portrays women as objects and sexualizes them. That was a couple of years ago; she has failed to produce the five videos and has moved on to other projects that require other people to pay her.
Please women, do not follow her example.
But what are your thoughts on equality in the West? State your points if possible, please; I'd like to have a debate with people who think the opposite.
What are your opinions on gender equality in the West?