I only briefly browsed the code, but this seems to be roughly what yelp/detect-secrets does.

Anyway, that doesn't really answer my question. To summarize the answers in this thread, I think PhilipRoman has captured the essence of it: strictly speaking, the idea of the entropy of a known string is nonsense, since entropy is a property of a probability distribution, not of a single fixed string. So, as I suspected, the information-theoretic definition isn't meaningfully applicable to the problem. And as other commenters like you mentioned, what we are really trying to measure is basically Kolmogorov complexity, which, strictly speaking, is uncomputable; but measuring the compression ratio under some well-known compression algorithm (allegedly) seems to be a good enough estimate, empirically.
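
For concreteness, here's a minimal sketch (mine, not from the linked code) of the two quantities being conflated: the Shannon entropy of a string's empirical character distribution, and a Kolmogorov-style estimate via zlib's compression ratio. The function names are made up for illustration.

    import math
    import zlib
    from collections import Counter

    def char_entropy(s: str) -> float:
        """Shannon entropy (bits/char) of the string's empirical character distribution."""
        counts = Counter(s)
        n = len(s)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    def compression_ratio(s: str) -> float:
        """Compressed size / original size; a crude stand-in for Kolmogorov complexity."""
        data = s.encode("utf-8")
        return len(zlib.compress(data, 9)) / len(data)

Note that char_entropy only sees symbol frequencies, so "abcabcabc" scores the same as any permutation of its characters; the compression ratio does pick up the repetition, which is roughly why it tracks Kolmogorov complexity better.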

But I think it's still an interesting linguistic question. Meaningful or not, it's well defined: so does it appear to work? Are there known constants for different kinds of text for any of these (or other) metrics? I would suspect this has been explored already, but neither I nor, apparently, anyone else in this thread has ever stumbled upon such an article.
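
To make the question concrete: "known constants" would look something like typical values of these metrics per text category. A toy measurement, reusing the two helpers from the sketch above (the sample strings are arbitrary, and real answers would need proper corpora):

    import base64, os

    samples = {
        "english prose": "It was a bright cold day in April, and the clocks were striking thirteen.",
        "hex digest":    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
        "random base64": base64.b64encode(os.urandom(48)).decode(),
    }

    for label, text in samples.items():
        # zlib's fixed header overhead dominates for strings this short,
        # so the ratios are only meaningful relative to each other
        print(f"{label:14} entropy={char_entropy(text):.2f} bits/char  "
              f"ratio={compression_ratio(text):.2f}")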
