The funniest thing is how synchronicity worked its magic:
Roo Code's experimental code indexing with a vector DB dropped 3 days ago. They're using Tree-sitter (the same as Aider) to parse sources into ASTs and run the vector embeddings on that output instead of on plaintext.
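Roughly, the idea looks like the sketch below. This is just a minimal illustration of AST-based chunking for embedding, not Roo Code's actual implementation; it assumes the py-tree-sitter bindings plus the tree-sitter-python grammar package, and embed() is a placeholder for whatever embedding model / vector DB client you'd wire in.

    # Minimal sketch: chunk source by AST nodes (whole functions/classes)
    # instead of arbitrary plaintext splits, then embed each chunk.
    # Assumes py-tree-sitter >= 0.22 and the tree_sitter_python grammar package.
    import tree_sitter_python as tspython
    from tree_sitter import Language, Parser

    PY_LANGUAGE = Language(tspython.language())
    parser = Parser(PY_LANGUAGE)

    def ast_chunks(source: bytes, wanted=("function_definition", "class_definition")):
        # Walk the parse tree and yield each top-level function/class as one chunk.
        tree = parser.parse(source)
        stack = [tree.root_node]
        while stack:
            node = stack.pop()
            if node.type in wanted:
                yield source[node.start_byte:node.end_byte].decode("utf8")
            else:
                stack.extend(node.children)

    def embed(chunk: str) -> list[float]:
        raise NotImplementedError  # placeholder: call your embedding model here

    if __name__ == "__main__":
        src = open("example.py", "rb").read()
        for chunk in ast_chunks(src):
            vector = embed(chunk)  # then upsert (chunk, vector) into the vector DB

The payoff is that each stored vector corresponds to a syntactically complete unit (a whole function or class) rather than a text window that may cut a definition in half.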
Can you add a recent build of llama.cpp (arm64) to the results pool? I'm really interested in comparing mlx to llama.cpp, but setting up mlx seems too difficult for me to do by myself.
I ran them again several times to make sure the results were fair. My previous runs also had a different 30B model loaded in the background that I forgot about.
LM Studio is an easy way to use both mlx and llama.cpp
Well, in that case, why not build a hybrid: a low-orbit satellite network synced to a few ground bases that provide the exact position of whichever satellite(s) are flying over?
Building a network of very tall towers every 58 km around the entire Moon seems very uneconomical.
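(For what it's worth, assuming that 58 km figure comes from line-of-sight geometry: on the Moon, with radius R of about 1737 km, two towers of height h can see each other over roughly 2*sqrt(2*R*h), so 58 km spacing implies towers of about (29 km)^2 / (2 * 1737 km), i.e. roughly 240 m tall.)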
To me it doesn’t look like a bug. I believe it is an intended “feature” pushed from upper management: a dark pattern to make plebs pay for an answer that has overflowed the quota.
Don't jump on hyped technologies (like Kubernetes or Kafka) from day one. You would be surprised how much you can do with a $5 VPS running PHP + Postgres.
Don't get fancy with your tech stack, just ship the bloody project!