
Sorry if I'm completely missing it, but I noticed there is something in the code around chat:

https://github.com/tembo-io/pg_vectorize/blob/main/src/chat....

This leads me to believe there is some way to use SQL not just for embeddings, but also for prompting/querying the LLM... which would be crazy powerful. Are there any examples of how to do this?



There is a RAG example here: https://github.com/tembo-io/pg_vectorize?tab=readme-ov-file#...
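
Roughly, the flow in that example looks like this (sketching from memory of the README, so the exact function and parameter names, the sample table, and the model identifiers may differ from the current version; the linked example is authoritative):

    -- one-time setup: embed products.description and register a RAG "agent"
    -- (table, column, and agent names here are illustrative)
    SELECT vectorize.init_rag(
        agent_name       => 'product_chat',
        table_name       => 'products',
        "column"         => 'description',
        unique_record_id => 'product_id',
        transformer      => 'sentence-transformers/all-MiniLM-L12-v2'
    );

    -- ask a question entirely from SQL: the extension retrieves the most
    -- relevant rows and sends them, plus the question, to the chat model
    SELECT vectorize.rag(
        agent_name => 'product_chat',
        query      => 'What is a pencil?',
        chat_model => 'gpt-3.5-turbo'
    ) -> 'chat_response';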

You can provide your own prompts by adding them to the `vectorize.prompts` table. There's an API for this in the works, but it's poorly documented at the moment.
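
For example, something along these lines (the column names and the {{ }} placeholders are assumptions based on the default rows that ship with the extension; inspect the table schema before relying on them):

    -- register a custom prompt type; \d vectorize.prompts shows the real
    -- schema, and the columns and placeholders below are assumptions
    INSERT INTO vectorize.prompts (prompt_type, sys_prompt, user_prompt)
    VALUES (
        'product_qa',
        'You are a helpful assistant. Answer using only the provided context.',
        'Context: {{ context_str }} Question: {{ query_str }}'
    );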



