The new unwritten rule in Hollywood, you know, is no nudity at all; but if there is any, it’s men’s. So we now have back, profile, and even full-frontal male nudity in the few films that dare show any skin at all.
I find it liberating for men. After all these years of being hidden away, they can finally show what they’re most proud of. Bravo!
How long until your feminazi group of choice starts denouncing Hollywood’s rampant discrimination against women and vindicating the right of women and minorities to show tits and vaginas on screen, just like their white male Western counterparts?