Hacker News

What I don’t understand is why is an API needed to create embeddings. Isn’t this something that could be done locally?


You would need a local copy of the GPT model, which is not exactly part of OpenAI's plans.


For embeddings, you can use smaller transformers/LLMs, or sentence2vec, and often get good-enough results.

You don't need very large models to generate usable embeddings.


You are correct. I assumed the parent was referring to the specific embeddings generated by OpenAI's LLMs.


It’s cheaper to use OpenAI. If you have your own compute, sentence-transformers is just as good for most use cases.
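Whichever model produces the embeddings, using them usually comes down to cosine similarity between vectors. A minimal sketch with toy 4-dimensional vectors standing in for real model output (the model name in the comment is just an illustrative sentence-transformers choice, not prescribed by this thread):

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors in place of real embeddings. With sentence-transformers,
# you would instead produce them locally, e.g.:
#   model = SentenceTransformer("all-MiniLM-L6-v2")  # one common choice
#   vec = model.encode("some sentence")
query = [0.1, 0.3, 0.5, 0.1]
doc_a = [0.1, 0.3, 0.5, 0.1]      # same direction  -> similarity 1.0
doc_b = [-0.1, -0.3, -0.5, -0.1]  # opposite direction -> similarity -1.0

print(round(cosine_similarity(query, doc_a), 4))  # 1.0
print(round(cosine_similarity(query, doc_b), 4))  # -1.0
```

The same scoring works whether the vectors come from a local model or from an embeddings API; only the vector dimensionality differs.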


Yes. The best public embedding model is decent, but I expect it’s objectively worse than the best model from OpenAI.


Sure, but I don't know of any models you can get local access to that work nearly as well.



