Hollywood Whitewashed: White Males Dominate Film

Veuer 2016-02-22

A new study conducted by two USC professors reveals that what was once thought to be an issue with the Oscars could be affecting all of Hollywood. The findings show a stark disparity between the opportunities afforded to white, straight men and those available to the rest of the industry. Kevan Kenney has more.
