Speaking of music best-of lists compiled from critics' votes: I've found these lists aren't very helpful for finding music I actually care about. That's frustrating, because there seems to be so much potential utility there.
I've explained it like this: to get on the list, something has to be considered at-least-good by a lot of people, and this tends to reward (1) herd mentality and (2) lowest-common-denominator taste. The list selects against anything in any niche, even when it's excellent.
As I look back through music that has meant a lot to me, there is just not much overlap with best-of lists.
Note: I'm talking about critics-vote, pooled, best-of lists. Single-critic best-of lists don't average out niche tastes and, for the right critic-listener match, can be very helpful indeed.
More so for music than for books, but I find that most of my favorites tend to have at least as many people who hate them as love them.
Neutral Milk Hotel's "In the Aeroplane Over the Sea", for example, is an extremely polarizing album. Between Mangum's nasal vocals, his beginner-to-intermediate technical ability, and the inclusion of saws and theremin as instruments, it all adds up to a love/hate affair. I'm not a big fan of any of Mangum's other work, but AOtS has an allure that is just... indescribably gripping.
I don't go around recommending it, but when it comes up in my playlist, I find that I am simply compelled to stop what I'm doing and listen to the entire album, which is thankfully short, as far as albums go.
Thank you. I hadn't thought about that album for many years, but now that you've brought it back to my attention, I'll listen to it later today - which nicely supports the point you're trying to make.
I remember a counterexample to this (which may be a fluke) from a few years ago, when Burial got album of the year on Metacritic for a spacey, ambient dubstep record. Maybe not super-niche, but definitely not mainstream.
It's not necessarily bad, but "best" books should be judged on rating, not recognition. Under this rating system, a book with 1000 reviews averaging 3.5 stars would score higher than a book with 500 reviews averaging 4.5 stars (3500 total stars vs. 2250).
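The difference between the two approaches can be sketched in a few lines. This is a minimal illustration, not the actual scoring formula used by any of the sites discussed: `raw_score` models ranking by total star count, while `weighted_rating` is an IMDb-style Bayesian average with an assumed prior mean and prior weight chosen purely for the example.

```python
# Illustrative sketch: ranking by raw vote totals vs. a Bayesian average.
# The prior parameters below are assumptions for demonstration, not
# values from Goodreads, Amazon, or IMDb.

def raw_score(avg_rating, num_reviews):
    """Total stars earned: rewards sheer review volume."""
    return avg_rating * num_reviews

def weighted_rating(avg_rating, num_reviews, prior_mean=3.5, prior_weight=300):
    """IMDb-style weighted rating: shrinks small samples toward a prior mean."""
    v, m = num_reviews, prior_weight
    return (v / (v + m)) * avg_rating + (m / (v + m)) * prior_mean

popular = (3.5, 1000)    # 3.5 stars average, 1000 reviews
acclaimed = (4.5, 500)   # 4.5 stars average, 500 reviews

print(raw_score(*popular), raw_score(*acclaimed))            # 3500.0 2250.0
print(weighted_rating(*popular), weighted_rating(*acclaimed))  # 3.5 4.125
```

With raw totals the more-reviewed 3.5-star book wins; the weighted rating instead ranks the better-rated book higher while still discounting books with very few reviews.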
I used to balk at anything on the "best-sellers list," assuming the majority would be poorly written beach-read material, but I realized there can be compelling reasons to read beyond the literary merit of a particular work. Pop literature can help us understand and analyze the culture we're living in, tap into the zeitgeist, and participate in it proactively. It can lead to moments of cognitive dissonance, for sure, but that's healthy.
There's a psychological effect (I don't remember if there's a name for it) where we tend to value or rank more favourably things that are popular or familiar. Tests have been done with groups rating newly heard songs, with and without seeing a fake ranking, and the group exposed to the ranking tended to follow it.
Yeah it's like how there always seem to be a few too many films from the last 5 years in "top 10 best films of all time" lists when the public vote for them.
Goodreads' rating system is useless anyway. It's 'out of five', which means most people will give a good book 4 or 5 stars - it's not like IMDb's more nuanced 'out of ten' rating.
They also display a bespoke meaning for the stars (e.g. 3 doesn't mean average), so for those who follow that meaning, their ratings will differ.
I find it's still a fairly good indicator. Last time I was looking for a new author, I went back and compared authors' average ratings. Authors I like were 4+. Authors I think are OK scored about 3.5. Authors I won't read again were generally 3 or so.
Perhaps 'useless' is a strong word. It's good for getting a general idea but too many books hover around the 4 mark. Whereas with IMDb anything 6 and above is worth the risk.
Do you have any evidence of that, or are you just saying 'of course Amazon must have done something'? I use goodreads and haven't noticed any changes that would cause that.
I have - mainly for Audible. I've noticed some commenters pointing out that there's only interest in a given book (in this case, the Arawn Cycle) because Audible ran a sale on the series.
Most people aren't active readers. Goodreads users are active readers. Amazon bought Goodreads (and other sites like Shelfari) because that's where the readers are.
I read 74 books so far this year. Most of them were purchased from Amazon.
Also, Goodreads has an active review community, and good "by reader" data for recommendations (as opposed to Amazon's "by account" data). If Amazon were smart, they'd be working very hard on improving recommendations with their Goodreads data.
I haven't read it, but it's interesting that "Fates and Furies: A Novel", a 3.5-star novel on Amazon, makes the top-20 list. Is this purely a function of sales, not ratings?
Those are editors' picks - unrelated to customer sales or ratings. The middling customer rating is a pretty common phenomenon in the world of literary fiction, where (a) some reviewers are clearly just disappointed that the book in question isn't an easily digestible genre work and (b) many people are inclined to be especially vociferous about their opinion of something they've sunk 8-20 hours into.
FWIW, I thoroughly enjoyed Fates and Furies in audiobook format, although it isn't without its flaws. Critic James Wood gave it a tepid review in The New Yorker.
Goodreads https://www.goodreads.com/choiceawards/best-books-2015
Amazon http://www.amazon.com/b?ie=UTF8&node=13108091011
The Washington Post https://www.washingtonpost.com/graphics/entertainment/best-b...
The New York Times http://www.nytimes.com/2015/12/06/books/review/100-notable-b...
The Economist http://www.economist.com/news/books-and-arts/21679439-best-b...