bbotond on July 6, 2023 | on: GPT-4 API General Availability
What I don’t understand is why an API is needed to create embeddings. Isn’t this something that could be done locally?
pantulis on July 6, 2023
You would need to have a local copy of the GPT model, which is not exactly part of OpenAI's plans.
jerrygenser on July 6, 2023
For embeddings, you can use smaller transformers/LLMs or sentence2vec and often get good enough results. You don't need very large models to generate usable embeddings.
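As a rough sketch of that local approach, here is what it looks like with the sentence-transformers library; the model name "all-MiniLM-L6-v2" is just one commonly used small model, not something anyone in the thread specified:

    # Local embeddings with a small model; no API call involved.
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs fine on CPU
    texts = ["What is an embedding?", "A vector representation of text."]
    vectors = model.encode(texts)
    print(vectors.shape)  # (2, 384) for this particular model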
pantulis on July 7, 2023
You are correct; I assumed the parent was referring to embeddings generated specifically by OpenAI LLMs.
thorum on July 6, 2023
It’s cheaper to use OpenAI. If you have your own compute, sentence-transformers is just as good for most use cases.
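For comparison, the hosted side of that trade-off is a single API call. A rough sketch using the openai Python client as it looked around the time of this thread (pre-1.0 interface, "text-embedding-ada-002" embedding model); the key is a placeholder:

    # Hosted embeddings via the OpenAI API; needs an API key, billed per token.
    import openai

    openai.api_key = "sk-..."  # placeholder
    response = openai.Embedding.create(
        model="text-embedding-ada-002",
        input=["What is an embedding?", "A vector representation of text."],
    )
    vectors = [item["embedding"] for item in response["data"]]
    print(len(vectors), len(vectors[0]))  # 2 vectors, 1536 dimensions each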
teaearlgraycold on July 6, 2023
Yes. The best public embedding model is decent, but I expect it’s objectively worse than the best model from OpenAI.
merpnderp on July 6, 2023
Sure, but I don't know of any models you can get local access to that work nearly as well.