
That's the reason a 5-sigma threshold is expected, so that anomalies are extremely rare. The more experiments one runs, the closer one gets to the mean value, and this happens exponentially fast. By that I mean that with some probability theory and algebra you can derive a confidence bound for the accuracy you demand, based on the number of experiments you run.
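For example, here is a minimal sketch of that algebra, assuming i.i.d. measurements bounded in [a, b] and using Hoeffding's inequality (the function name and parameters are mine, just for illustration):

  import math

  def samples_needed(eps, delta, a=0.0, b=1.0):
      # Hoeffding: P(|mean_n - mu| >= eps) <= 2 * exp(-2 * n * eps**2 / (b - a)**2),
      # so n >= (b - a)**2 * ln(2 / delta) / (2 * eps**2) experiments suffice
      # for accuracy eps with confidence at least 1 - delta.
      return math.ceil((b - a) ** 2 * math.log(2 / delta) / (2 * eps ** 2))

  print(samples_needed(eps=0.01, delta=1e-6))  # 72544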


I think this happens sublinearly w.r.t. sample size.


I would like to point you towards Hoeffding's inequality. The probability of deviating by more than epsilon from the mean after n experiments decays exponentially in n.
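Concretely, for i.i.d. X_1, ..., X_n taking values in [a, b] with mean mu, Hoeffding's inequality states

  P(|(X_1 + ... + X_n)/n - mu| >= eps) <= 2 * exp(-2 * n * eps^2 / (b - a)^2)

so for any fixed eps the deviation probability decays exponentially in n.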


I think you're talking about different things.

For a fixed range around the mean, the probability of landing outside that range decays like 1/k^n for some k > 1, i.e. exponentially.

But for a fixed probability, the width of the range that captures that probability shrinks like 1/sqrt(n).
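A quick simulation makes the distinction visible. This is just a sketch with my own parameter choices (fair coin flips, eps = 0.05, 95% confidence):

  import numpy as np

  rng = np.random.default_rng(0)
  trials = 100_000  # Monte Carlo repetitions per sample size
  eps = 0.05

  for n in [50, 100, 200, 400, 800]:
      # Sample means of n fair coin flips, repeated `trials` times.
      means = rng.binomial(n, 0.5, size=trials) / n
      # Fixed range: tail probability drops exponentially (cf. the Hoeffding bound).
      tail = np.mean(np.abs(means - 0.5) >= eps)
      bound = 2 * np.exp(-2 * n * eps ** 2)
      # Fixed probability: the 95% half-width shrinks like 1/sqrt(n)
      # (about 1.96 * 0.5 / sqrt(n) for a fair coin, by the CLT).
      width = np.quantile(np.abs(means - 0.5), 0.95)
      print(f"n={n:4d}  tail={tail:.4f}  bound={bound:.4f}  "
            f"width={width:.4f}  0.98/sqrt(n)={0.98 / np.sqrt(n):.4f}")

The tail column collapses far faster than the width column as n grows, which is exactly the difference between the two statements above.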



