Sexism in Hollywood

Is Hollywood Sexist?

Hollywood has been under a lot of scrutiny lately, with the recent Oscar nominations being announced with no female nominations at all in any of the categories, which has led people to believe that Hollywood is sexist. There have also been a number of recent films where, according to studies, the female lead has fewer lines than female leads have had before. Another reason this question has come up again is the number of rebooted films that originally had male leads and were very successful, but that haven't been anywhere near as successful as the originals now that they have been remade with female leads. It has also been stated that in 2017 and 2018 women spent half as much time on screen as men.

Hollywood has been known for being sexist pretty much since its beginnings, mostly due to the studio system it was run under. That was a terrible time for actors in general, who were treated awfully and didn't have anywhere near the freedom they do now. Actors were literally told which films they could act in without any real say, and studios would trade actors for particular films, so an actor could be working for one studio one day and suddenly be moved to another without knowing.

It is hard for Hollywood not to be sexist, especially if studios see that films with female leads aren't as successful as the ones with male leads. At the end of the day, the goal of these films is to make their money back plus more, and they believe that is least likely to happen when the lead of the film is female. It is sad to say, but there are many reasons a female lead may not be popular with the public. Maybe writers just can't write good scripts or stories with a female lead, so they write the part as if it were a male lead and simply have a woman act it, which doesn't work because the film won't feel realistic enough for the audience to get into the story. Films are supposed to feel realistic and take you to another world that feels believable.

I believe Hollywood will always be sexist, simply because there are so many factors affecting women outside of it that would have to change in our society before any change happens in Hollywood. As long as men are the dominant earners, women will always be playing catch-up. Maybe when that changes there will be a change in Hollywood, but in my opinion I don't see it changing anytime soon, only because I feel there are so many other problems the world and society have to deal with first before they can sort out the gender issue.
