We get the chance to hear the story the directors want to tell.
The ‘best’ films, in my opinion, are the ones where I leave the cinema reflecting on what the film had to teach me.
I’m sure there is a place for ‘feel good’ and family films that can teach young children moral messages.
But there are also many films (as well as TV shows) that pollute our screens. We don’t really need another ‘I’m a Celeb’ or ‘Love Island’, do we?
What are some of the best films that have taught you something?
(P.S. Please send me some of your recommendations to firstname.lastname@example.org)