delijati's comments

Interested. I've been on Linux for 20 years now but I'd never heard of bubblewrap :D. I currently run OpenCode in Docker but always assumed there was a better way. So bubblewrap and your script seem like the perfect fit.
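For context, a minimal sketch of what such a bubblewrap wrapper could look like (my guess, not the author's script; it assumes a merged-/usr distro and that opencode is on $PATH):

    # Read-only system, network allowed, write access only to the project dir.
    bwrap \
      --ro-bind /usr /usr \
      --symlink usr/bin /bin \
      --symlink usr/lib /lib \
      --symlink usr/lib64 /lib64 \
      --proc /proc \
      --dev /dev \
      --tmpfs /tmp \
      --ro-bind /etc/resolv.conf /etc/resolv.conf \
      --bind "$PWD" "$PWD" \
      --chdir "$PWD" \
      --unshare-all \
      --share-net \
      --die-with-parent \
      opencode

The point of --unshare-all plus --share-net is that the agent keeps API access but sees none of the host's other namespaces or files.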


I have now updated the above to add my OpenCode script. Hope it helps!

"Time in the market beats timing the market" -> Kenneth Fisher ... I learned it the hard way ;)


> In contrast to the transient velocity gains, Cursor adoption shows more sustained patterns across static analysis warnings and code complexity, with evidence of sustained technical debt accumulation.


Is there a CLI like gemini-cli but for Mistral? ... yes, I know about aider.


OpenCode.


Do you happen to know if there is a company in the EU that hosts these models (DeepSeek, Qwen3, Kimi)?


Most inference companies (Synthetic included) host in a mix of the U.S. and EU; I don't know of any that promise EU-only hosting, though. Even Mistral doesn't promise EU-only AFAIK, despite being a French company. I think at that point you're probably looking at on-prem hosting, or buying a maxed-out Mac Studio and running the big models quantized to Q4 (although even that couldn't run Kimi: you might be able to get it working over Ethernet with two Mac Studios, but the tokens/sec will be pretty rough).
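For the two-machine setup, one hedged sketch is llama.cpp's RPC backend (hostname, port, and the GGUF filename below are placeholders, and a Kimi-sized model may still not fit):

    # On the second Mac Studio, start a worker:
    rpc-server --host 0.0.0.0 --port 50052

    # On the first, offload layers to the remote worker:
    llama-cli -m kimi-k2-Q4_K_M.gguf --rpc 192.168.1.2:50052 -ngl 99 -p "hi"

Throughput over the Ethernet link is usually the bottleneck, which is exactly the "pretty rough" tokens/sec caveat above.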


Uhh, nice! Do you know what model they are using?


Does anyone know if there is a hosting provider for Kimi K2, Qwen3, or DeepSeek V3/R1 in the EU?



Plugins! I completely missed that when testing this earlier. Thank you, will have to take another look at it.


There is nothing lite about litellm ... I was experimenting with it (using it as a lib) but ended up using https://llm.datasette.io/en/stable/index.html. BTW, thanks @simonw for llm.
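For anyone comparing: the library side of llm is pleasantly small. A rough sketch (the model name is whatever you have keys configured for):

    import llm

    # Resolve a configured model and run a single prompt against it.
    model = llm.get_model("gpt-4o-mini")
    response = model.prompt("Explain bubblewrap in one sentence.")
    print(response.text())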


