If you like the VPS route, Hetzner with Dokploy works great. The UI has essentially all the features of Fly or Render that you'd actually use for deployment, like preview build URLs and environments.
Eh, no, depends on why you used Heroku in the first place. Way back when, I used it because the UI was dead simple and it Just Worked™. If I can replicate that with a VPS and have a good UI around it that takes care of everything, it's functionally the same to me.
Heroku was one of the first to have that seamless UX; only later did others like Fly, Render, and Railway come along to copy it. I'd wager people were primarily attracted to that user experience and only minimally cared about fully hosted versus not, because AWS was also around at the time.
Having used Heroku at multiple startups during the 2012–2015 years, this is not correct.
With Heroku you could `git push heroku master` and it would do everything else from there. The UX was nice, but that was not the reason people chose it. It was so easy compared to running on EC2 instances with Salt or whatever. For simple projects, it was incredible.
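For anyone who never used it, the whole flow was roughly this (a sketch from memory; the app name is made up):

```
# one-time setup: creates the app and adds a git remote named "heroku"
heroku create my-simple-app

# every deploy after that: push, and Heroku builds and releases it
git push heroku master
```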
That's literally the UX I'm talking about, and it's what other companies copied too. To be clear, I'm not (just) talking about how heroku.com looks and works; I mean the entire user experience, including git push to deploy, so I believe you're agreeing with me here. That's why I said a VPS with Dokploy or Coolify and so on has the same UX: git push deploys are supported on the command line, and (now, at least) the website user experience is vastly superior, akin to Vercel.
Dokku is better. And neither is what Heroku's bread and butter customer needs.
But alas, my interest in painstakingly explaining why self-hosting is fundamentally incompatible with a product whose value prop was "nothing to install" is waning.
You and I simply have different opinions on what Heroku's value proposition was. AWS was also right there and was also "nothing to install," yet people chose Heroku primarily for its dead simple UX, something that can be replicated even in a self-hosted environment. The value prop was never about PaaS versus self-hosting; it was always about the user experience.
There are some people on r/LocalLlama using it [0]. The consensus seems to be that while it does have more unified RAM for running models (up to half a terabyte), token generation can be slow enough that you might be better off with an Nvidia or AMD machine.
You should look into Kilo Pass by Kilo Code (https://kilo.ai/features/kilo-pass). It's basically a fixed subscription where your credits roll over each month, and you get free bonus credits too, which are used up before paid credits. It's similar to paying for Cursor except the credits roll over, which is why I'm contemplating moving to it: I don't want to be locked into any one LLM provider the way Claude Code or Codex lock you in.
I was wondering how Kilo Code's Kilo Pass pricing compared to OpenRouter's top-up pricing. After some digging, the main difference seems to be that OpenRouter provides a standard API key (sk-or-...) that works in any application (LangChain, curl, your own Python apps), while Kilo Pass credits are tied to the Kilo Gateway, which is designed to power the Kilo Code extension (VS Code/JetBrains) and CLI.
Kilo Code does not appear to let you generate a "Kilo API Key" to use in your external Python scripts or third-party apps. But the monthly bonus credits are sweet.
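For example, an OpenRouter key works anywhere you can make an HTTP call, since OpenRouter exposes an OpenAI-compatible endpoint (the model name here is just illustrative):

```
# any sk-or-... key works against OpenRouter's OpenAI-compatible API
curl https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer sk-or-..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```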