But am I crazy or did the pre-production version of gemini-embedding-001 have a much larger max context length?
Edit: It seems like it did? 8k -> 2k? Huge downgrade if true. I was really excited about the experimental model reaching GA before seeing this.
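If the 2k limit sticks, the usual workaround is chunking plus pooling. Here's a rough sketch using the google-genai Python SDK; the character-based splitting and mean-pooling are my own simplifications (not anything Google documents), and the chunk size is a crude stand-in for a real tokenizer:

```python
# Sketch: work around a 2,048-token input limit by chunking and mean-pooling.
# Assumes GEMINI_API_KEY is set in the environment.
import numpy as np
from google import genai

client = genai.Client()

# Crude proxy for ~2k tokens; a real tokenizer would be more accurate.
MAX_CHARS = 6000

def embed_long_text(text: str) -> np.ndarray:
    # Split the document into chunks small enough for the GA model's limit.
    chunks = [text[i : i + MAX_CHARS] for i in range(0, len(text), MAX_CHARS)]
    result = client.models.embed_content(
        model="gemini-embedding-001",
        contents=chunks,
    )
    # Mean-pool the per-chunk embeddings into one document vector,
    # then re-normalize so cosine similarity still behaves.
    vectors = np.array([e.values for e in result.embeddings])
    pooled = vectors.mean(axis=0)
    return pooled / np.linalg.norm(pooled)
```

Pooling loses cross-chunk context, though, which is exactly why an 8k window was nicer in the first place.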