Appreciate the Verge link. The post doesn't really bring new info to the conversation until the trial, though.
It doesn’t seem like there are good answers on how to resolve this issue, sadly. I wish there were more transparency on how much these social media giants are spending on this issue, how much content is removed per year, and the average time that content was accessible. From the outside it’s hard to judge the size and scope of the problem and the resources being thrown at it. And that doesn’t even get into the psychological toll this has to take on the reviewers. This is truly some of the darkest stuff humanity produces, and the scale makes it hard to resolve.
> It doesn’t seem like there are good answers on how to resolve this issue, sadly.
Actually, the article highlights an internal working group of 84 employees that spent months evaluating Twitter's ability to suppress this exploitative material. It concluded that Twitter had major problems in this area and submitted a list of recommendations to fix them, most of which were ignored.
> Aside from enabling in-app reporting of CSE, there appears to have been little progress on the group’s other recommendations.
> Employees say that Twitter’s executives know about the problem, but the company has repeatedly failed to act.