Feminism
October 14, 2015
The f-word. Many people are afraid of it, or simply don’t understand it. So, what is feminism?
Feminism, according to Dictionary.com, is “advocating social, political, legal, and economic rights for women equal to those of men.” Not that scary, right?
Many people see feminism as the desire for women to be better than men and have more rights and opportunities than they do, but that’s not the case. Feminists simply want women to be equal to men in every situation, whether in jobs, politics, or pay. For example, the wage gap has been widely discussed and debated because women earn 77¢ to a man’s dollar for performing the same job. Another misconception is that women in positions of power are cold or bossy, while a man holding the same exact position is sharp, confident, and a great leader. These double standards negatively affect women while lifting men up.
Women and girls are often the punchline of jokes, such as the hysterical “get back in the kitchen” joke (which is ironic, considering that the culinary arts is a male-dominated industry). When women take offense at these “jokes,” it isn’t because they are too sensitive; the problem is that when young girls hear them, they are discouraged from trying to achieve more than society allows.
With prominent female figures voicing their opinions against the idea of feminism, it can be easy for some young girls to be opposed to it as well. Stars like Katy Perry and Carrie Underwood have voiced their anti-feminist stances, both agreeing that “the word can be intimidating.”
Feminism is not about hating men or wanting to be better than them. Girls need feminism because they are told they cannot do certain jobs because they are female, or that they cannot have the same rights and chances as men. Feminism focuses on empowering young girls to voice their opinions and stand up for themselves and for other girls.