
But it’s not just voting. They bias what gets put in front of people, and what doesn’t.

It’s a bit like saying a drug dealer has no accountability because it’s the user who buys the drugs. In this case FB is well aware of the consequences of its actions and, even worse, of its total inability to do anything when the content isn’t in a dominant language. This has had profound consequences in places like Myanmar and India, where disinformation spreads like wildfire and leads to people being killed. You’d think they’d step back, go “whoa, this isn’t what we intended FB to be,” and do something about it. The opposite has happened.

> They bias what gets put in front of people, and what doesn’t.

What came first though, the chicken or the egg?

Did the content exist first in an unbiased fashion, with the algorithms then trained on which content generated the best engagement, so that content got promoted,

or

did Facebook just start pushing bad content behind the scenes on a whim, with no backing information? I don’t think that happened.

