Roo has codebase indexing that, when enabled, it instructs the agent to use.
It uses whatever embedding model you point it at and backs it with a Qdrant vector db. Roo's docs point you toward free cloud services for this, but I found those to be dreadfully slow.
Fortunately, it takes about 20 minutes to spin up a Qdrant Docker container and install Ollama locally. I've found the nomic embedding model is fast enough for the task even running on CPU. There's an initial spin-up while it embeds the existing codebase; after that it's basically real-time as changes are made.
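For reference, the 20-minute local setup amounts to roughly this. Ports and the model name here are the common defaults, not anything Roo-specific; check Roo's codebase-indexing docs for the exact settings it expects:

```shell
# Run Qdrant locally, persisting its storage to a local directory
docker run -d --name qdrant -p 6333:6333 \
  -v "$(pwd)/qdrant_storage:/qdrant/storage" qdrant/qdrant

# Install Ollama (Linux install script; on macOS, `brew install ollama` also works)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the nomic embedding model -- fast enough even on CPU
ollama pull nomic-embed-text
```

Then point Roo's indexing config at Qdrant on `http://localhost:6333` and at Ollama's default endpoint, `http://localhost:11434`.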
FWIW, I've found the indexing is worth the effort to set up. When the index is available, the models are generally better at finding what they need without completely blowing up their context windows.
We really are living in the golden age of the terminal. I thought this would take a chunk out of TypeScript/Node market share among young coders, but I'm starting to see more and more of these animals building TUIs using nothing but npm packages.
Last week I built my own CLI coding agent using just Node.js and zero dependencies! It's a lot of fun to build, really; I think everyone should try it.
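To give a flavor of the zero-dependency approach: one building block such an agent needs is parsing a streaming chat response, which you can do with plain string ops and `JSON.parse`, no npm packages. This is a minimal sketch assuming an OpenAI-style SSE stream; the payload shape is an assumption, not what my agent or any particular API necessarily uses:

```javascript
// Parse one raw SSE chunk from an OpenAI-style streaming chat endpoint
// (an assumed wire format for illustration) and return the text deltas.
// Zero dependencies: just built-in string methods and JSON.parse.
function parseSseChunk(raw) {
  const deltas = [];
  for (const line of raw.split("\n")) {
    if (!line.startsWith("data: ")) continue; // SSE data lines only
    const payload = line.slice(6).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    try {
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) deltas.push(delta);
    } catch {
      // Ignore JSON split across chunk boundaries; a real agent
      // would buffer the partial line until the next chunk arrives.
    }
  }
  return deltas.join("");
}
```

Wire that up to `https.request` and `readline` from the standard library and you have the skeleton of a terminal agent loop.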