
> Mastodon's culture of anti-everything is naive. All posts (except "DMs") are public and can be scraped and made searchable at will for anyone mildly motivated to do so.

It's the opposite of naive: it's extremely well thought out and heavily deliberated. Making so many things "public" by default is an invitation to people, an intentional welcome mat in the old school "Internet 1.0" sense. But just because you want to welcome people doesn't mean you have to welcome robots (crawlers, etc.), and many instances say so deliberately, in a very old school "Internet 1.0" way, in their ROBOTS.TXT file (in addition to other places).
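As an illustration of what that welcome mat for people but not robots can look like, here is a sketch of a robots.txt an instance might publish. The user-agent tokens are real crawler names, but the policy itself is hypothetical, not any particular instance's actual file:

```
# Illustrative robots.txt: welcome people, turn away select robots.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
```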

In the old web, crawlers were expected to read ROBOTS.TXT, and no matter how "public" the website they found seemed to be, ROBOTS.TXT was supposed to be the final word.
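That courtesy check is simple enough to sketch with Python's standard library. The rules and user-agent names below are illustrative, not any real instance's policy:

```python
from urllib import robotparser

# Hypothetical robots.txt rules a well-behaved crawler would honor
# before fetching anything. In practice these would be fetched from
# the site's /robots.txt rather than hard-coded.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "User-agent: BadBot",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A good actor asks first; the final word belongs to robots.txt.
allowed = rp.can_fetch("FriendlyBot", "https://example.social/@alice")
blocked = rp.can_fetch("BadBot", "https://example.social/@alice")
```

In a real crawler you would call `rp.set_url(".../robots.txt")` and `rp.read()` to fetch the live file, then gate every request on `can_fetch()`.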

Anyone scraping or making searchable "at will" random chunks of the Fediverse is easily violating some number of ROBOTS.TXT files. That is an ancient technical convention, neither new nor naive. The internet knew even then that bad actors would ignore ROBOTS.TXT files. The old internet learned to name and shame the bad actors, and in some cases backed that up by force with firewall blocks or even lawsuits. Mastodon does that too. That's why a lot of Mastodon instances are preemptively blocking Threads: they don't trust Meta to follow good behaviors such as checking ROBOTS.TXT. Meta hasn't shown a history of being a good actor there, and Threads' privacy policies seem to imply that Meta doesn't care to be a good actor even for its own users (to the point of not supporting EU users at all because GDPR is "too hard"). That makes it much harder to assume Meta will be a good actor with respect to all of the conventions around Mastodon data, including the classic ROBOTS.TXT.

The Mastodon culture of "public for people, but not for ROBOTS, or only select ROBOTS" is an ancient internet tradition. It's hard to call that naive when it has decades of history and internet social norms (including good outcomes) behind it. What's naive is thinking that because some major corporations stopped respecting good social norms in the name of increased ad revenue, those norms no longer apply and "anything technically possible is allowable". Read the ROBOTS.TXT in the room and stop being motivated by technology for technology's sake without respecting ethics. Be a good actor in any ecosystem.



You and I agree. I've been on the web since 1996 and the credo you talk about is deeply ingrained in my ethics.

But it's still wishful thinking. We live in an age where AI companies are bold enough to scrape the crap out of even the largest of the other big tech companies without blinking: without permission, attribution, or compensation. So surely a little Mastodon scrape isn't a problem.

There's no need to talk about how unethical it is; we agree. The problem is that it's hard, if not impossible, to stop. That's what I mean by naive.


I don't think it is naive to believe and fight for ethics. I think it takes a lot of courage, especially in a time of disillusionment where you can often feel like the entire industry has lost its mind and put only the most unethical people in charge. I'd rather fight for ethics than say "we can't have nice things because no one is ethical". That takes guts.

I don't think it is exactly "wishful thinking" to believe that the way we get back to promoting ethics in software is by expecting people to behave ethically. We are surely doomed to be disappointed when people turn out to fail us, but that's all the more reason to fight for it: to remind people what ethics are and why a polite society needs them. All of those disappointments are teaching opportunities, if people are open to listening.

(Will Meta learn anything at all from all the Mastodon instances that have preemptively blocked them on ethics concerns? Who knows? Mastodon can teach, but it can't force the student to learn. Is it worth Mastodon trying and fighting to teach Meta, no matter what happens? I'd say yes. Ethics are, at bottom, a social construct. How we talk about them and how we try to teach them says a lot about who we are and what our ethics are.)

I'd rather have even the attempt at ethics than despair that "ethics are technically impossible to enforce". We know ethics can't be programmed; that's why we have to enforce them socially.



