Every year since 1987, "PREMIERE" magazine in the US has asked a panel of 15-20 well-known American critics to rate the previous year's major movies (four stars being the highest rating).

The result is a massive table of titles and ratings that takes up a two-page spread in the magazine's April issue and is scrutinised religiously by list-maniacs like me for months afterwards. Despite (though actually because of) the mainstream tastes of the critics involved, it's probably the most useful indicator of what were considered the "best films" in America in any given year.

Unfortunately, when it comes to picking the critics' Top 10 (actually Top 12, me being an extra-nice guy) for each year, there's a problem: do you rank the movies by absolute scores, i.e. the total no. of stars received, or by averages, i.e. the total no. of stars divided by the no. of critics who rated the film? What's the difference, you're thinking - and, ideally, the answer would be "none".

Trouble is, those pesky critics haven't always seen all the movies they're asked to rate (the exception always being, for some reason, Peter Travers of "Rolling Stone"). Which means the playing field isn't always level: high-profile movies get rated by everyone, which helps their absolute score, while the more obscure stuff - docs, indies, foreign films - gets smaller totals but better averages (probably because the people who see them want to advertise the fact, and give them higher ratings).

Which is why, e.g., the little-known LET HIM HAVE IT makes the 1991 best-average list despite being seen by only about half the year's critics. Does that mean the critics who never saw it would have given it similar ratings if they had seen it? Bearing in mind that most of them probably chose not to see it because it didn't sound like their kind of movie, probably not. On the other hand, a well-known film like TO DIE FOR makes the 1995 absolute-scores list even though only 3 critics liked it enough to give it four stars - its average is actually not that great, but enough critics saw it for their ratings to pile up into Ten-Best status. Is that fair?
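The totals-vs-averages gap above can be sketched in a few lines of Python. The numbers here are entirely made up for illustration (not Premiere's actual data); `None` marks a critic who skipped the film:

```python
# Toy example: ten critics, ratings from 0 to 4 stars.
# None means the critic never saw the movie.
ratings = {
    "Blockbuster": [3, 3, 2, 3, 2, 3, 2, 3, 3, 2],  # seen by all ten
    "ObscureDoc": [4, 4, 4, None, None, None, None, None, None, None],  # seen by three fans
}

def total_stars(scores):
    # Absolute score: just add up every star actually given.
    return sum(s for s in scores if s is not None)

def average_stars(scores):
    # Average: divide by the number of critics who actually rated it.
    seen = [s for s in scores if s is not None]
    return total_stars(scores) / len(seen)

for title, scores in ratings.items():
    print(title, total_stars(scores), round(average_stars(scores), 2))
# The blockbuster wins on totals (26 vs 12),
# the little-seen doc wins on averages (4.0 vs 2.6).
```

Same two films, opposite rankings, purely because of who bothered to show up - which is the whole problem in a nutshell.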

After wrestling with the above problem for, oh, maybe five minutes (a lot less than it's taken me to write about it, in fact) I finally decided to use averages, if only because anyone who prefers absolute scores need only take the magazines and (duh) add up all the little stars after each movie. (Not that it takes a rocket scientist to work out the averages, but at least you need a pocket calculator.) "PREMIERE" itself, by the way, has been using averages since 1995, and previously used absolute scores for 1990-93; there was no "official" Top Ten in 1987-89, and I'm not counting 1994 at all because that was the year when the compilation of results was (apparently) assigned to some mathematically-challenged temp, resulting in a Top 10 that makes no sense whichever way you look at it.

What's that you say? I should shut the hell up and just give you the results?...Oh. Okay...

Rank / Title (Director) / Average Score [out of 4]

2000 (no critics' poll)

(*): Actually has a slightly lower average than the two #11s, and should be #12, but this is how "Premiere" ranked them.

2003 (no critics' poll)