Hollywood’s political credibility is now so utterly destroyed it glows in the dark
The real War on Women was in Hollywood.
The Harvey Weinstein scandal has the potential to deal a body blow to the role that Hollywood actors, and by extension the entire entertainment industry, play in our political life.
For decades we have had to listen to Hollywood actors moralizing and denigrating Republicans, particularly conservatives. That moralizing has gone into overdrive in the age of Trump.
And all the while, they knew, they ALL knew about Harvey. They even joked about it publicly:
Many accounts are now coming out, but this one by French actress Léa Seydoux, ‘I had to defend myself’: the night Harvey Weinstein jumped on me, sums up the film industry:
Everyone knew what Harvey was up to and no one did anything. It’s unbelievable that he’s been able to act like this for decades and still keep his career. That’s only possible because he has a huge amount of power. In this industry, there are directors who abuse their position. They are very influential, that’s how they can do that. With Harvey, it was physical. With others, it’s just words. Sometimes, it feels like you have to be very strong to be a woman in the film industry. It’s very common to encounter these kinds of men.
As you listen to Patricia Arquette — whose sister Rosanna was a Weinstein target — make an impassioned speech for women’s rights during the 2015 Oscars, scan the crowd. They knew. They ALL knew.
And they knew not just of Harvey Weinstein’s predations, but that their industry was rotten to the core. Rita Panahi, writing in the (Australian) Herald Sun, had it right in Toxic Hollywood industry behind elite grandstanding:
HOLLYWOOD’S elite likes to portray themselves as progressive crusaders who are morally superior to the masses that make them rich. Anyone unfortunate enough to catch the preachy performances at the Oscars and Emmys could attest to the entertainment industry’s Trump-obsessed self-righteousness. But the reality is that behind the holier-than-thou grandstanding is an industry as toxic as a sewage treatment plant. The Harvey Weinstein scandal has laid bare Hollywood’s ugly underbelly, and it’s much worse than you imagined.
We often say in politics that the cover-up is worse than the crime. That’s not true here; the crime was worse. But the cover-up by Hollywood is bad enough that it should be the end of Hollywood’s role in politics as anything other than an object of derision.
Hollywood weaponized Democrats’ phony War on Women attacks on Republicans, but the real War on Women was in Hollywood.
To borrow a metaphor from Ted Cruz, Hollywood’s political credibility is now so utterly destroyed it glows in the dark. We need to keep it that way.