Hacker News | rurban's comments

And before that it was DJs and "Dancers", btw.

Sure they are. Rawhide is Fedora's official community-driven rolling release, tracking the latest packages. If you want to get a package into Fedora (and thus RHEL), put it into Rawhide, not Terra.

Always Kamila! She ranks higher than Fabrice Bellard for me, because you can actually talk to her.

Excellent idea to promote US businesses. All our US customers are still keeping their heads down and investing nothing. Canada, Australia and the Netherlands are now at the top.

Poor Glass House.


You also got the Tesla keys, nice!

True. I had to kill my dynamic service collecting film festival ratings, because the bots drained the memory of the still-available free hosters. I fought it for two years with user-agent and IP-range blocking, but eventually gave up. So I had to revert to static pages hosted on GitHub Pages. The bots cannot kill that, but the features are very limited.

Interestingly, he left out the episodes with his second wife, the young Instagram model Kristina Basham (now married to another guy), whom he was very proud of back then.

A similar thing happened to me at a surf competition at an undisclosed place. People were flown in from everywhere; my surfboard arrived too late because the plane sucked.

But like me, everybody just gathered at the hot tub; nobody took part in the real sports event, we just fooled around and were happy to meet people from all around the world: Brazil, Hawaii, Germany, Canada, the US. Really nice event. No idea if there even was a winner. Maybe they put something on the web page, but nobody cared.


Of course not. Users love the chatbot. It's fast and easier to use than manually searching for answers or piecing together reports and graphs.

There is no latency, because the inference is done locally, on a server at the customer's site with a big GPU.


> There is no latency

Every chatbot I was ever forced to use has built-in latency, together with an animated "…" to simulate a real user typing. It's the worst of all worlds.


> to simulate a real user typing

The models return a real-time stream of tokens.


This was already the case before LLMs became a thing, and it is still the case for no-intelligence step-by-step bots.

Because they are all using some cloud service and an external LLM for that. We don't.

We sell our users a strong server where they have all their data and all their services. The LLM is local and trained by us.
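For context, a setup like the one described (local inference on the customer's own GPU server, streaming tokens straight to the UI) could look roughly like the sketch below. This is only an illustration, not their actual stack: the endpoint URL, port, model name, and the assumption of an OpenAI-compatible local server (e.g. vLLM or llama.cpp's server) are all mine.

    # Minimal sketch, assuming an OpenAI-compatible LLM server running
    # on the customer's own machine. Host, port, and model name are
    # placeholders, not details from the thread.
    import json
    import requests

    def stream_chat(prompt: str,
                    url: str = "http://localhost:8000/v1/chat/completions",
                    model: str = "local-model") -> None:
        payload = {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": True,  # server sends tokens as they are generated
        }
        with requests.post(url, json=payload, stream=True, timeout=60) as resp:
            resp.raise_for_status()
            # The server streams server-sent events: "data: {...}" lines.
            for line in resp.iter_lines():
                if not line or not line.startswith(b"data: "):
                    continue
                chunk = line[len(b"data: "):]
                if chunk == b"[DONE]":
                    break
                delta = json.loads(chunk)["choices"][0]["delta"]
                # Print each token fragment as soon as it arrives,
                # instead of faking a typing animation.
                print(delta.get("content", ""), end="", flush=True)

    if __name__ == "__main__":
        stream_chat("Summarize last month's support tickets.")

Because the round trip never leaves the customer's network, the only perceived delay is the model's own generation speed, which is what the "no latency" claim above is getting at.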


vui.el, nice!
