
I've been using GPT-5 through the API, and the response reports 5000 output tokens (+4000 for reasoning), but when I run the output text through a local tokenizer in Python it counts roughly 2000. I haven't put time into figuring out what's going on, but has anyone else noticed this? Are they using some new tokenizer?
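
A minimal sketch of the comparison, assuming the model name "gpt-5", the example prompt, and the o200k_base fallback encoding (all of these are guesses on my part, not confirmed details; if GPT-5 really uses a new tokenizer, the local count won't match):

    # Sketch: compare the API-reported usage with a local tiktoken count.
    # Assumptions: the model name "gpt-5", the prompt, and the o200k_base
    # fallback encoding are guesses; a new tokenizer would make the local
    # count diverge from the billed count.
    from openai import OpenAI
    import tiktoken

    client = OpenAI()

    resp = client.responses.create(
        model="gpt-5",  # assumed model name
        input="Explain byte-pair encoding in one paragraph.",
    )

    # Usage as reported by the API. Reasoning tokens are counted toward
    # output tokens but are not returned in the visible text.
    print("API output tokens:   ", resp.usage.output_tokens)
    print("API reasoning tokens:", resp.usage.output_tokens_details.reasoning_tokens)

    # Local count of only the visible output text.
    try:
        enc = tiktoken.encoding_for_model("gpt-5")
    except KeyError:
        enc = tiktoken.get_encoding("o200k_base")  # assumed fallback
    print("Local count of visible text:", len(enc.encode(resp.output_text)))

Since reasoning tokens are billed as output but never show up in the returned text, a local count of just the visible output will be lower than the API's figure, which may account for part of the gap.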



