Hacker News | icarito's comments

OP here, the actual content of the missing domain is here: https://somosazucar.github.io/www-blog/ for now


Hi all — I help manage somosazucar.org, one of the local volunteer groups of Sugar Labs, the nonprofit behind the open-source Sugar Learning Platform originally developed for the One Laptop Per Child (OLPC) project.

SomosAzúcar has supported open education and children’s digital literacy initiatives across Latin America since 2009.

The domain expired on 2025-10-06, but due to a Postfix configuration issue on sugarlabs.org, GoDaddy’s renewal notices never reached us.

By the time we discovered the problem — about 35 days after expiration — GoDaddy informed us that the domain was already being prepared for auction, and that the only way to recover it would be to bid for it like any other buyer.

It feels wrong that a long-standing nonprofit project could lose its .org domain over a technical mail glitch.

Has anyone here faced something similar with GoDaddy or other registrars?

Is there any way to appeal to PIR (.org registry) or GoDaddy executive support to restore the domain before it’s auctioned?

Any advice or contacts would be deeply appreciated — this domain represents more than 15 years of open education work.


First off, you should get a better, less douchy domain provider than GoDaddy. I like Namecheap.

You're better off just dumping them and changing domains. Don't put up with this kind of BS.

You could try suing them, but they'd probably roll you.


For all of you using `llm` - perhaps take a look at [Gtk-llm-chat](https://github.com/icarito/gtk-llm-chat).

I put a lot of effort into it. It integrates with the `llm` command-line tool and with your desktop, via a tray icon and a nice chat window.

I recently released 3.0.0 with packages for all three major desktop operating systems.


This is pretty great.


Interesting. What do you use it for beyond the normal chatting?


I sometimes use `llm` from the command line, for instance with a fragment, or piping a resource from the web with curl, and then pick up the conversation id with `llm gtk-chat --cid MYCID`.
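For anyone unfamiliar with that flow, a rough sketch (assumes the `llm` CLI with a configured model plus the gtk-llm-chat plugin are installed; the URL is just a placeholder, and MYCID stands for the conversation id, which `llm logs` shows):

```shell
# Start a conversation by piping a web resource into llm
curl -s https://example.com | llm "summarize this page"

# Later, pick that same conversation up in the GUI chat window
llm gtk-chat --cid MYCID
```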


I'm actually planning on abandoning Simon's infra soon. I want a multi-stream, routing-based solution that is more aware of modern API advancements.

The Unix shell is good at being the glue between programs. We've increased the dimensionality with LLMs.

Some kind of ports-based system, like named pipes with consumers and producers.

Maybe something like gRPC or NATS (https://github.com/nats-io). MQTT might also work. Network transparency would be great.


That's right, this is a nontrivial problem that I struggled with for gtk-llm-chat too! I resolved it using the markdown-it-py library to render the stream.
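The tricky part is that Markdown constructs can be split across streamed chunks, so rendering chunk-by-chunk produces broken output. One common pattern (not necessarily exactly what gtk-llm-chat does) is to accumulate the stream and re-render the whole buffer on each chunk; the `toy_render` below is a stand-in for a real renderer such as markdown-it-py's `MarkdownIt().render()`:

```python
class StreamingMarkdownView:
    """Accumulate streamed chunks and re-render the full buffer each time.

    `render` is a callable standing in for a real Markdown renderer,
    e.g. markdown-it-py's MarkdownIt().render().
    """

    def __init__(self, render):
        self.render = render
        self.buffer = ""
        self.html = ""

    def feed(self, chunk):
        # Re-rendering the whole buffer keeps the output valid even when
        # a construct like **bold** arrives split across two chunks.
        self.buffer += chunk
        self.html = self.render(self.buffer)
        return self.html


# Toy renderer: wraps paragraphs in <p>; a real app would use markdown-it-py.
def toy_render(text):
    return "".join(f"<p>{p}</p>" for p in text.split("\n\n"))


view = StreamingMarkdownView(toy_render)
for chunk in ["Hello **wo", "rld**", "\n\nbye"]:
    html = view.feed(chunk)
```

Re-rendering from scratch sounds wasteful, but for chat-sized messages a fast parser makes it cheap enough in practice.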


Huh, this might be another approach with a bit of effort. Thanks for that; I didn't know about this.


Hey, I felt bad that there was a longer delay, so by lazy-loading everything I could, I managed to bring the startup time down from 2.2 seconds to 0.6 on my machine! Massive improvement! Thanks for the challenge!


Nice, that's a huge difference.


I've got an i5-10210U CPU @ 1.60GHz.

You triggered my curiosity. The chat window consistently takes 2.28s to start. The Python interpreter takes roughly 30ms to start. I'll be doing some profiling.


I'd truly like to know! But I have no access to a Mac to try. If you can, try it and let me know? If it works, please send a screenshot!


Can confirm it works on Mac!

gtk-chat at least; I'm having some issues with the notification lib for gtk-applet.

screenshot: https://postimg.cc/KKxQNdG6


I wonder! In my more modest setup, it takes a couple of seconds perhaps. After that it's quite usable.


Yeah, I agree, I've been thinking about using Rust. But ultimately it's a GTK3 vs. GTK4 problem too: if we could reuse the Python interpreter from the applet, that would speed things up, but GTK4 doesn't have support for AppIndicator icons(!).

I've been pondering whether to backport to GTK3 for this sole purpose. I find that after the initial delay to startup the app, its speed is okay...

Porting to Rust is not really planned because I'd lose the llm Python base, but it's still something that triggers my curiosity.


Yeah, absolutely. I've just gotten to the point where I'm happy with the architecture, so I'll continue to add UI. I've just added support for fragments, and I'm thinking of presenting them as attached documents. On the radar: switching models mid-conversation, and perhaps the ability to roll back a conversation or remove some messages. But yeah, moving the system prompt and parameters in too would be nice! Thanks for the suggestions!


Awesome. It would be great to see a nice GTK-based open-source competitor to LM Studio and the like.

