Heuristics are annoying and data-dependent algo changes are dangerous.

Completely different example, but consider the limits on Splunk aggregations: your report runs fine on small data, but when you scale it up (to real production data sizes, say), you suddenly get wrong numbers, maybe results like "0 errors of type X" when the real answer is that there are >=1 errors, because one of the aggregations has a window size limit that it is silently applying. This stuff is dangerous.

What Splunk was doing for me would be the equivalent of an SQL join giving approximate answers when the data is too big.
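For illustration, here's a tiny Python sketch of that failure mode (the names and the cap are invented for this example, nothing to do with Splunk's actual internals): an aggregation that silently limits how many events it looks at, so a rare error type that only shows up past the cap gets reported as zero.

    # Hypothetical sketch of a silently-applied aggregation limit.
    # Names and the cap are made up for illustration only.
    from collections import Counter

    WINDOW_LIMIT = 10_000  # assumed cap, applied without any warning

    def count_error_types(events, limit=WINDOW_LIMIT):
        """Count events by error type, but only look at the first `limit` events."""
        counts = Counter()
        for i, event in enumerate(events):
            if i >= limit:
                break  # silently stop aggregating; the caller is never told
            counts[event["error_type"]] += 1
        return counts

    # Small data: exact. Production-sized data: a rare error type that only
    # appears after the cutoff is reported as 0 even though it exists.
    events = [{"error_type": "A"}] * 20_000 + [{"error_type": "X"}] * 5
    print(count_error_types(events).get("X", 0))  # -> 0, but 5 X errors exist

The dangerous part is that break: nothing in the output distinguishes "zero X errors" from "we stopped counting before we reached any".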
