Harvey Weinstein is gone, but Hollywood is still a man's world
The entertainment industry has undergone a tectonic shift over the past two years, but many of the most influential people have remained the same.
In Hollywood, directing jobs no longer go automatically to white people. Television writers' rooms have made diversity and inclusion a top priority. The human resources departments of large media companies are more responsive when complaints are filed.
Intimacy coordinators, who bring physical consent into the artistic process, are now standard on productions with sexual content. Almost two and a half years have passed since allegations of sexual misconduct against Harvey Weinstein became public, and much has changed in Hollywood.
But the entertainment industry has operated a certain way for decades, and not every aspect has changed quickly. Even though Mr. Weinstein was convicted of two sex crimes on Monday, Hollywood largely remains a man's world. Take the Oscars, the ultimate manifestation of power and prestige. For the ninth time in 10 years, the Academy of Motion Picture Arts and Sciences did not nominate a woman for best director in 2020.
Only one of the 20 acting nominations went to a person of color. And with the exception of Parasite and Little Women, most of the Academy's favored films — The Irishman, Ford v Ferrari, Once Upon a Time in Hollywood, and Joker — were portraits of white men made by prominent white filmmakers.