
Have you tried messing with the target perplexity? That seems to make a big difference.



Yeah, I've tried values from 0.5 to 50; 3 seems to be the best value currently. I may try it with the non-Barnes-Hut implementation too, maybe on a smaller number of words. If that works, then I'll bet it's something related to how the quadtree is constructed... (rough sweep sketch below)

Btw, nice LDA visualization!
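
For anyone wanting to reproduce this kind of sweep: here's a minimal sketch using scikit-learn's TSNE. The library choice and the toy data are my assumptions, not the original setup, which presumably used actual word vectors and a different t-SNE implementation.

    # Sweep the t-SNE target perplexity and report the final KL divergence
    # for each run. Assumes scikit-learn; the random vectors below are a
    # hypothetical stand-in for word embeddings.
    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 50))  # placeholder for word vectors

    for perplexity in (0.5, 3, 5, 10, 30, 50):
        tsne = TSNE(
            n_components=2,
            perplexity=perplexity,
            method="barnes_hut",  # switch to "exact" for the non-BH run
            random_state=0,
        )
        Y = tsne.fit_transform(X)
        print(f"perplexity={perplexity}: KL divergence={tsne.kl_divergence_:.3f}")

Note that KL divergences aren't strictly comparable across perplexities, so in practice you'd eyeball the resulting embeddings rather than just the numbers; perplexity roughly controls the effective number of neighbors each point considers.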

