The Trouble Of Quantifying Art

It’s received wisdom that movies adapted from written material are, to steal a phrase from an excellent album by The Tangent, not as good as the book. I tend to agree with that wisdom, but there are some notable exceptions (Dangerous Liaisons, The Sweet Hereafter, LA Confidential). Is there some way we can tell, with math and numbers and stuff, which ones work better than others?

No. No there isn’t.

That doesn’t keep people from trying. Back when the last part of The Hunger Games film saga came out, stats guru Nate Silver purported to identify the 20 “most extreme cases” of film adaptations that failed to live up to the quality of their book source material. When I clicked the link I was actually hoping Silver might have broken out of his usual routine and embraced the ambiguity in the world.

Alas (footnotes omitted):

there are extreme cases where book-lover rage is justifiable. Which cases? I pulled the Metacritic critic ratings of the top 500 movies on IMDB tagged with the “based on novel” keyword. I then found the average user rating of the source novel for each film on Goodreads, a book rating and review site. In the end, there was complete data for 382 films and source novels.

The results are kind of fun to look at. Remember, Silver’s most interested in the divergence between good books and bad movies, and vice versa, so a great adaptation of a great novel kind of falls through the cracks. Still, who knew that Up in the Air was so much better a film than the novel? But that list points out some of the problems with Silver’s undertaking. Is it completely accurate to call Apocalypse Now an “adaptation” of Heart of Darkness, or is it simply inspired by it? And while Dr. Strangelove may have adapted the plot of Red Alert, it turned the book’s ideas on their head, playing them as dark satire rather than serious, suspenseful drama (that treatment came from Fail Safe). The reverse list is a little easier to understand, for the reasons Silver mentions.

But the real problem isn’t in the particular results; it’s in the method. Mainly, Silver is comparing apples and oranges. He somewhat concedes this in a footnote, admitting that there isn’t a critic aggregator for books the way Rotten Tomatoes is for films. But he falters in then assuming there isn’t a like-to-like comparison he can make. IMDB also has user ratings, numbers generated by fans that are far more comparable to the Goodreads ratings than Metacritic’s critic averages are.
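Just to make the like-to-like idea concrete, here’s a rough sketch (in Python, with made-up titles and numbers, not Silver’s actual data) of what comparing fan ratings to fan ratings might look like: put IMDB’s 0–10 user scores and Goodreads’ 1–5 averages on the same 0–100 scale and rank by the gap.

```python
# A rough sketch of a fan-to-fan comparison: rescale IMDB user ratings (0-10)
# and Goodreads averages (1-5) to a common 0-100 scale, then rank by the gap.
# The titles and numbers below are placeholders, not real data.

films = [
    # (title, imdb_user_rating, goodreads_avg_rating)
    ("Example Adaptation A", 7.9, 3.6),
    ("Example Adaptation B", 6.1, 4.3),
    ("Example Adaptation C", 8.4, 4.1),
]

def imdb_to_100(score):
    """IMDB user ratings run 0-10, so just multiply by 10."""
    return score * 10

def goodreads_to_100(score):
    """Goodreads averages run 1-5; map that range onto 0-100."""
    return (score - 1) / 4 * 100

gaps = []
for title, imdb, goodreads in films:
    gap = imdb_to_100(imdb) - goodreads_to_100(goodreads)
    gaps.append((gap, title))

# Negative gaps mean the book fans liked it more; positive, the movie fans did.
for gap, title in sorted(gaps):
    print(f"{title}: {gap:+.1f}")
```

It’s still crude, for all the reasons below, but at least both numbers come from the same kind of rater.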

The difference is important. Every few years you’ll see a think piece like this one about how film critics don’t appear to have much influence on which movies people go see. They routinely trash the kind of big summer popcorn movies that do billions of dollars in business. That doesn’t mean the critics are right or wrong, just that they have a different frame of reference than fans. A critic sometimes sees a dozen movies in a week, whether they appeal to her likes and dislikes or not. The typical moviegoer, on the other hand, sees one or two and tends to pick stuff he thinks he’ll enjoy. So comparing book fans to movie fans, rather than to critics, would have been the right baseline for a comparison like this.

Another problem, one Silver doesn’t address at all, is scale. Put simply, even really popular books are read by orders of magnitude fewer people than see the adaptations of them. Going back to Up in the Air, how many people read that book? Yet the film did more than $166 million at the box office. My thought is that book fans tend to be more passionate than movie fans, more invested in their favorites. Not to mention, if most of the people who see an adaptation never read the source material, how can they compare one to the other?

Ultimately, that’s all nitpicking. Trying to reduce art to numbers is a fool’s game. Fun to play sometimes, but in the end it’s like trying to light a cigarette in a hurricane.

UPDATE: Or, as James Poniewozik puts it in a New York Times write-up of the best in television for 2015:

Art isn’t math.

Amen.


3 thoughts on “The Trouble Of Quantifying Art”

  1. Rotten Tomatoes is an interesting example in this discussion. It’s useful because it doesn’t attempt to rate the films directly per se, but rather shows you the percentage of people who liked them. Still, though, you really need to read what the critics are saying rather than simply relying on the number. There are more than a few mediocre films with high Tomatometer scores.

    Ever done anything with Goodreads? I haven’t gotten into it, but it may be a useful tool.


  2. That’s an interesting thought about Rotten Tomatoes. It doesn’t try to quantify like Metacritic, just comes down to a version of the old Siskel & Ebert thumbs up/down decision. But you’re right, you really need to read the reviews. Ideally, you read the same critics regularly and get to know what they like and how it matches up to what you like.

    Goodreads is a pretty good tool for reaching out to readers. I’ve done a couple of giveaways and I’m active (every now and then) in some of the groups.

