This would lead me to believe there is some way to actually use SQL for not just embeddings, but also prompting/querying the LLM... which would be crazy powerful. Are there any examples on how to do this?
You can provide your own prompts by adding them to the `vectorize.prompts` table. There's an API for this in the works, though it's poorly documented at the moment.
https://github.com/tembo-io/pg_vectorize/blob/main/src/chat....
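A rough sketch of what registering a custom prompt might look like. The column names and template placeholders below are guesses, not the confirmed schema — inspect the table (`\d vectorize.prompts`) or the linked source before relying on them:

```sql
-- Hypothetical example: add a custom prompt template to pg_vectorize.
-- Column names here are assumptions; check the actual table definition first.
INSERT INTO vectorize.prompts (prompt_type, sys_prompt, user_prompt)
VALUES (
    'my_custom_prompt',
    'You are a helpful assistant. Answer using only the provided context.',
    'Context: {{ context }}  Question: {{ user_query }}'
);
```

The idea would then be to reference the prompt by name when querying the LLM from SQL, but since the API is still in flux, the source file above is the authoritative reference.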