Entertainment

21 of the Biggest Lies Hollywood Movies Told Us About Sex



Hollywood movies can make anything look glamorous, from first kisses and heated arguments to painfully awkward encounters and getting the flu. But none of that compares to how Hollywood has handled sex in popular films.

Every single sex scene is carefully choreographed and performed in front of a cast and crew watching intently for little things to fix or fine-tune. And then we, the viewers, see the final product and think it all looks so natural. We gaze in awe, we get aroused, and sometimes we ask ourselves: “Why in the world don't I ever have steamy sex like that?!”

Well, if Hollywood has taught us anything, it's that these films don't aim to be authentic – they aim to entertain (or, in this case, to titillate). Most of us already know this, but those over-the-top love scenes still haven't stopped us from setting the bar ridiculously high for our own sex lives… And that can be quite problematic.

See some of the biggest lies that mainstream films have told us about sex:
