This reminds me of a bias I saw on Quora, where a popular contributor gets tons of upvotes from fans regardless of whether the answer is any good (often you have to dig deep into the comments to find out that it isn't). I should qualify that I don't know if it's still that way; I started using it when it first launched and quit 6-7 years ago, once it started turning into a fanboi, you-upvote-me-I-upvote-you club.


It's probably inherent to any site that allows you to follow individuals. For Twitter that's not much of a problem, because most people know better than to rely on Twitter for life advice or product reviews. Letterboxd, on the other hand, manifests exactly the problem you're talking about, and the whole point of the site is to assign purported ratings to films and TV shows.

Reddit and IMDb are mostly immune to this, which is not to say they're immune to all problems (astroturfing, brigading, personal biases, etc.).


Reddit seems to be trying to change that, though? I don't really follow development on new Reddit, but it seems more user-oriented?


They have been _trying_ to make it more user-oriented for a while, but even in the new design, users aren't really any more important or recognized than they've always been.

The main change is that you can now post to your profile rather than to a particular subreddit. But that's not a big feature; back in the day, people would just create a subreddit named after their username.


There was a study some time back where they set folks up in groups, then had them listen to and vote on bands/music.

Each group trended towards catapulting a single band/musician, but which one always just depended on which got momentum first in that group.

I wish I could find the study again but it's hard to google for.

Pretty eye opening.
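
For intuition, here's a minimal cumulative-advantage sketch in Python. This is my own toy model, not the study's actual setup; the 0.9 "follow the crowd" probability, the song/listener counts, and the one-download seed are all made up for illustration:

    import random

    def simulate_market(n_songs=8, n_listeners=2000, social_weight=0.9):
        # Every song starts equal; one seed download each so weights are nonzero.
        downloads = [1] * n_songs
        for _ in range(n_listeners):
            if random.random() < social_weight:
                # Follow the crowd: pick proportionally to downloads so far.
                song = random.choices(range(n_songs), weights=downloads)[0]
            else:
                # Independent taste: pick uniformly at random.
                song = random.randrange(n_songs)
            downloads[song] += 1
        return downloads

    # Identical "songs", yet each run crowns a different runaway winner:
    for _ in range(3):
        print(sorted(simulate_market(), reverse=True))

Run it a few times: one song dominates almost every run, but which one changes from run to run, which is roughly the dynamic the commenter is describing.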


Salganik MJ, Dodds PS, Watts DJ, "Experimental Study of Inequality and Unpredictability in an Artificial Cultural Market", Science (2006).

https://www.science.org/doi/10.1126/science.1121066


Thank you for finding this; I've been asked for the source before and could never dig it up!


It's no different from Reddit or HN in the sense that most people who vote never bothered to read the entire post, and the same is true of peer review. This isn't limited to the soft sciences, as the Bogdanov affair showed.

They skim, see if it looks okayish and give it their approval.


HN doesn't allow you to follow users, which makes a big difference in putting the focus on the content rather than on the users.


Peer reviewers also don't follow particular names around; they see the name above the submission and it influences them, as it no doubt does on HN.

But it's far worse: even without recognized names, most votes are cast without proper reading, and I'm fairly certain also by the least intelligent subsection, given how often submissions that are pure clickbait get upvoted on HN and Reddit and then demolished in the comments by people who actually read them. People who vote by and large read only the title or the first sentence and make up their minds from there.


> Peer reviewers also don't follow particular names around; they see the name above the submission and it influences them, as it no doubt does on HN.

Agreed. I was referring to Quora.


Well, that's because Quora has distribution mechanics, like the home feed and email digests, that can amplify content from popular contributors. It's not like Reddit or HN or StackExchange, where answers/comments compete directly against each other on the question page. I don't think the situation is analogous.
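
A rough sketch of why that distinction matters. All numbers here are invented, and "visibility" is just a toy stand-in for feed impressions, not how any of these sites actually rank things:

    import random

    def votes(visibility, n_viewers=1000):
        # Toy model: each viewer sees the answer with probability
        # `visibility` and upvotes it whenever they see it.
        return sum(random.random() < visibility for _ in range(n_viewers))

    followers = {"big_name": 50_000, "newcomer": 50}

    # Feed model (Quora-like, assumed): visibility scales with author reach.
    total = sum(followers.values())
    feed = {a: f / total for a, f in followers.items()}

    # Page model (Reddit/HN/SO-like): both answers sit on the same page.
    page = {a: 0.5 for a in followers}

    for label, model in [("feed", feed), ("page", page)]:
        print(label, {a: votes(v) for a, v in model.items()})

With identical answer quality, the feed model hands the big-name author nearly all the votes purely on follower count, while the page model splits them roughly evenly.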


It was worse on Digg, before it "pivoted" into whatever it is now.

https://www.wired.com/2012/07/mklopez-digg-power-user-interv...


You'll see this on SO, where "rockstars" get their answer accepted (checkmarked) even though a better answer precedes theirs.



