T.W.
Thursday 22nd October 2009 5:50pm [Edited]
15,786 posts
British dentists and plastic surgeons set to have a field day?
UK Drama Indies Told To Look For Dishier Actors
Presumably whether the actors are any good, or suit the show they're cast in, becomes rather immaterial. Is this a retrograde step in trying to persuade society not to judge people on their looks? Or just the harsh reality of the marketplace?
If they came out and said "You should use fewer black or ethnic-minority actors in order to sell shows", people would rightly be appalled. Yet it might well be statistically true that this would improve a show's chances of selling to certain countries. Doesn't TV/film have some minor responsibility to influence society, rather than just reflect it?