Sexism in Hollywood

Is Hollywood sexist? Hollywood has come under a great deal of scrutiny lately: the recent Oscar nominations were announced with no female nominations at all in any of the categories, which has led people to believe that Hollywood is sexist. There have also been a number of recent films where…
