
From a human, to a centaur, to a pegasus, as it were.

You're too late; businesses absolutely did ask that before 2020, when the hype was at its peak.

Regular old boring profitable f500 enterprises?

Lots of examples, such as Walmart and IBM's collaboration: https://www.researchgate.net/publication/326188675_Food_Trac...

It's a little too late for that; all the models train on prompts and responses.

Not to be confused with builder.io, or worse, builder.ai

If you like VPS, Hetzner with Dokploy. It works great, the UI has essentially all the features of Fly or Render that you'd use for deployment, like preview build URLs and environments.
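
For reference, getting Dokploy onto a fresh VPS is roughly a one-liner plus a web setup step. The install script URL and default dashboard port below are how I remember their docs, so treat them as assumptions and check the current instructions before running anything:

    # On a fresh Ubuntu VPS (e.g. a Hetzner instance), as root:
    # install script URL assumed from Dokploy's docs -- verify first
    curl -sSL https://dokploy.com/install.sh | sh

    # Then open http://<server-ip>:3000 (default port, as I recall)
    # and finish setup in the web UI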

Very close to the worst alternative for people who actually need Heroku, but it won't stop people from plugging it to death and back.

Eh, no, depends on why you used Heroku in the first place. Way back when, I used it because the UI was dead simple and it Just Worked™. If I can replicate that with a VPS and have a good UI around it that takes care of everything, it's functionally the same to me.

"Depends on what you used it for" applies to just about any platform.

Realistically, self-hosting the PaaS defeats the purpose of a PaaS for the crowd Heroku was attracting.


Heroku was one of the first to have that seamless UX, only after which others like Fly or Render or Railway came to copy it. I wager people were primarily attracted to that user experience and only minimally cared that it was fully hosted versus not, because there was also AWS at that time.

Having used Heroku at multiple startups during the 2012–2015 years, this is not correct.

With Heroku you could `git push heroku master` and it would do everything else from there. The UX was nice, but that was not the reason people chose it. It was so easy compared to running on EC2 instances with Salt or whatever. For simple projects, it was incredible.
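
For anyone who never used it, the whole flow was roughly the following (the app name is just illustrative):

    # from an app directory with a git repo and a Procfile
    heroku create my-app        # provisions the app on Heroku
    git push heroku master      # build + release happen on Heroku's side
    heroku open                 # open the deployed app in a browser
    heroku logs --tail          # stream logs if something breaks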


That's literally the UX I'm talking about, and that's what other companies copied too. To be clear, I'm not (just) talking about how heroku.com looks and works, I'm talking about the entire user experience including git push to deploy, so I believe you are agreeing with me here. That is why I said a VPS with Dokploy or Coolify and so on has the same UX, both on the command line, with git push deploys supported, and (now, at least) a vastly superior website user experience, akin to Vercel.

How do you think self-hosting affects that seamless UX they value?

As I said, the correct software on top handles it all for you. I don't think you've actually tried Dokploy.

Dokku is better. And neither is what Heroku's bread and butter customer needs.

But alas, my interest in painstakingly explaining why self-hosting is fundamentally incompatible with a product whose value prop was "nothing to install" is waning.

Have a good one.


You and I simply have different opinions on what Heroku's value proposition was, because, again, AWS was also right there and also was "nothing to install." Therefore Heroku was used primarily for its dead simple UX, something which is replicated even in a self-hosted environment, because, again, the value prop was never about PaaS or self-hosting; it was always about the user experience.

Have a good weekend.


Ok, so I am researching what to use in this space (a Vercel-ish clone on a cheap VPS): is Dokploy really the best option?

What do you think about Caprover? https://github.com/caprover/caprover

Or uh.. Dokku https://github.com/dokku/dokku

Right now I am using Coolify, but so far it has not been exactly reliable.


I don't like their UIs; Dokploy's is far more modern. And yes, Coolify is not known to be very reliable, especially because it's built on PHP.

Check out these videos:

https://youtu.be/ELkPcuO5ebo

https://youtu.be/RoANBROvUeE


Claude Code is made with Anthropic's models and is very commercially successful.

Something besides AI tooling. This isn't Amway.

Since they started doing that, it's gained a lot of bugs.

Should have used codex. (jk ofc)

How's the inference speed? What was the price? I'm guessing you can fit the entire model without quantization?

See also, PGLite: https://pglite.dev/

And Turso: https://turso.tech/


No multi-writer support.

There are some people on r/LocalLlama using it [0]. The consensus seems to be that while it does have more unified RAM for running models, up to half a terabyte, token generation can be slow enough that it might just be better to get an Nvidia or AMD machine.

[0] https://old.reddit.com/r/LocalLLaMA/search?q=mac+studio&rest...


Thanks for the link. I'll take a look.

You should look into Kilo Pass by Kilo Code (https://kilo.ai/features/kilo-pass). It's basically a fixed subscription whose credits roll over each month, and you get free extra credits too, which are used up before paid credits. It's similar to paying for Cursor except the credits roll over, which is why I'm contemplating moving to it: I don't want to be locked into any one LLM provider the way Claude Code or Codex lock you in.

I was wondering how KiloCode's Kilo Pass pricing compared to OpenRouter's top-up pricing, and after some digging the main difference seems to be that OpenRouter provides a standard API key (sk-or-...) that works in any application (LangChain, curl, your own Python apps), while Kilo Pass credits are tied to the Kilo Gateway, which is designed to power the KiloCode extension (VS Code/JetBrains) and CLI. KiloCode does not appear to let you generate a "Kilo API Key" to use in your external Python scripts or third-party apps. But the monthly bonus credits are sweet.
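
To make the difference concrete, an OpenRouter key works from plain curl against their OpenAI-compatible endpoint; the model name and key below are placeholders, and the payload shape is the standard chat-completions format as I understand it, so double-check the specifics:

    # OpenRouter's OpenAI-compatible chat endpoint; key and model are placeholders
    curl https://openrouter.ai/api/v1/chat/completions \
      -H "Authorization: Bearer sk-or-REPLACE_ME" \
      -H "Content-Type: application/json" \
      -d '{
            "model": "anthropic/claude-sonnet-4",
            "messages": [{"role": "user", "content": "Hello"}]
          }'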

Yes, it's for development not deployment.
