Yes, one more precise way to phrase this is that the dot product between two random unit vectors concentrates around 0 as the dimension grows: the expected value is exactly 0 by symmetry, and the typical magnitude scales like 1/sqrt(dimension). But the probability of drawing two exactly orthogonal vectors at random (from any continuous distribution over the reals) is zero - the dot product will be very small but nonzero.
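A quick numpy sketch of that concentration (dimensions and sample counts picked arbitrarily): the mean absolute dot product of random unit vectors shrinks roughly in step with 1/sqrt(d).

    import numpy as np

    rng = np.random.default_rng(0)
    for d in (10, 100, 1000, 10000):
        # sample pairs of unit vectors uniformly on the sphere
        # by normalizing Gaussian draws
        u = rng.standard_normal((1000, d))
        v = rng.standard_normal((1000, d))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        v /= np.linalg.norm(v, axis=1, keepdims=True)
        dots = np.einsum('ij,ij->i', u, v)  # row-wise dot products
        print(d, np.abs(dots).mean(), 1 / np.sqrt(d))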

That said, for sparse high-dimensional datasets, which don't come from a continuous distribution over the full vector space, the probability of exact orthogonality can be quite high - e.g. if half your vectors have support totally disjoint from the other half's, a randomly chosen pair is exactly orthogonal at least half the time.
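A tiny sketch of the disjoint-support case (coordinates made up for illustration) - the dot product here is exactly 0.0, not merely small:

    import numpy as np

    d = 1000
    a = np.zeros(d)
    b = np.zeros(d)
    a[: d // 2] = np.random.rand(d // 2)  # supported on the first half of coordinates
    b[d // 2 :] = np.random.rand(d // 2)  # supported on the second half
    print(np.dot(a, b))  # exactly 0.0: disjoint support forces true orthogonality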

Note that ML/LLM practitioners use "approximate orthogonality" anyway.
