I consider Tyler Cowen a fairly brilliant guy, and he's influenced a lot of how I think about various issues. He's an economist, a gamer, and an academic, so he has a hell of a start endearing himself to me.
Because I hold Mr. Cowen in such high esteem, I was surprised to read his post today, which criticized Spin's move from a 5-point ranking system to a 10-point scale.
I almost exclusively read film reviews through the filter of Metacritic, which converts all the written reviews to a 100 point scale. I find the small scales typical of newspapers uninformative. There are a lot of 4/5 star films, or albums, or restaurants. That represents a huge range of quality, and I often want more information about items in that range.
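To make the compression concrete: mapping between scales is just linear rescaling, and going from a fine scale down to a coarse one destroys information. The helpers below are a hypothetical sketch, not Metacritic's actual formula (theirs is a proprietary weighted average of critic scores); they only illustrate how many distinct 100-point scores collapse into a single star rating.

```python
def rescale(score: float, scale_max: float, target_max: float = 100.0) -> float:
    """Linearly map a score from a 0..scale_max scale onto 0..target_max."""
    return score / scale_max * target_max

def quantize(score: float, scale_max: float, levels: int) -> int:
    """Collapse a 0..scale_max score onto an integer scale with `levels` steps."""
    return round(score / scale_max * levels)

# A 4/5-star review maps cleanly up to 80 on a 100-point scale...
print(rescale(4, 5))  # 80.0

# ...but going the other way, a wide band of 100-point scores
# all collapse to the same 4 stars:
print({s: quantize(s, 100, 5) for s in (71, 75, 80, 85, 89)})
```

That band is exactly the "huge range of quality" hiding inside a 4/5-star review: an 89 and a 71 read identically once quantized to five levels.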
A commenter suggests that these expanded scales are only good some of the time,
More granularity is useless if you are making no comparisons and just making a pure recommendation: Go see Coraline. Read The Hobbit. Don't read The Stand. Etc.
- Bob Montgomery
At first blush, that seems plausible. If a friend of mine asks me "Should I read Down and Out in the Magic Kingdom?", and I reply, "87!" then I can see where that would be confusing.
If I replied, "I'd peg the chance you like it at 87%," I can see that as weird, and perhaps the product of questionable methodology, but not as less helpful than "Yes." Certainly not useless, as the commenter suggests.
I began with a commenter, because I worry that Mr. Cowen's thoughts here are... less developed.
"But say they give a new release eight, nine, or who knows maybe eight and a half stars? What exactly are they trying to say?"
This means that the album in question is better than a 7-star album, but worse than a 10-star album. I'm not sure where the difficulty lies in understanding such a system. Numbers mean what they always mean. This is not Crazy 8s.
I think that Cowen is trying to suggest that the main function of any rating scale is for the magazine to "put its name on the line" with its maximal review. A five-star scale then lets the magazine really go out on a limb for a lot of albums, while a ten-star scale lets a magazine softball some truly great works at a 9, reserving 10s for "safe bets".
This is not my intuition of how reviewing systems work, probably owing to my experience with video game ratings. For a long period of time, the video game magazine industry rated almost every game in the 7-9 range. This ruined their credibility for me and for others: the fact that these were basically trade magazines shilling for the mainstream wasn't somehow disguised by the 10-point scale.
I now see reviewers as putting their name on the line with everything they write, and I'm no longer interested in giving them opportunities to hide their opinions in broad, imprecise scales.