Hacker News | lorantt's comments

O(1) is usually qualified as "amortized" or "expected", which is where the difference in interpretation comes from imo. I found this explanation a bit wishy-washy on some details, but good overall (and it has an appropriate appeal to authority/original sources).

https://m.reddit.com/r/compsci/comments/2z74z8/why_is_hashta...


Yup, the pathological worst case can be annotated with Ω(n).
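To see why the worst case is linear: if an adversarial (or just bad) hash function sends every key to the same bucket, each lookup degenerates into a linear scan over all n keys. A minimal sketch in Python (the `BadHash` class is made up for illustration; a real dict still returns correct answers, just slowly):

```python
class BadHash:
    """Key type whose hash is constant, forcing every key to collide."""

    def __init__(self, n):
        self.n = n

    def __hash__(self):
        return 0  # all keys land on the same probe chain

    def __eq__(self, other):
        return isinstance(other, BadHash) and self.n == other.n


# Correctness is preserved, but each lookup must compare against the
# colliding keys one by one: Omega(n) comparisons in the worst case.
table = {BadHash(i): i for i in range(200)}
print(table[BadHash(199)])  # found, but only after a linear scan
```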


I think people here implicitly assume linked buckets; those are bad on modern architectures for several reasons (pointer chasing defeats the cache, and every node is a separate allocation).

Look at Hopscotch, Robin Hood, or Cuckoo hashing for open addressing with a high fill factor (~0.9) and _amortized_ O(1). I've seen a paper somewhere that proved a worst case of O(log log n) for Robin Hood hashing, afair.

http://www.sebastiansylvan.com/post/robin-hood-hashing-shoul...

https://en.m.wikipedia.org/wiki/Hopscotch_hashing
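The Robin Hood idea from the links above can be sketched in a few lines: on insert, linear probing proceeds as usual, but whenever the probing entry is "poorer" (further from its home slot) than the resident entry, they swap, which keeps probe lengths short and uniform. A minimal, illustrative Python sketch (the class and its sizing policy are my own assumptions, not the linked implementations):

```python
class RobinHoodMap:
    """Open addressing with linear probing and Robin Hood displacement.

    Each slot stores (key, value, probe_distance), where probe_distance
    is how far the entry sits from its home bucket.
    """

    def __init__(self, capacity=16):
        self.capacity = capacity
        self.slots = [None] * capacity
        self.size = 0

    def put(self, key, value):
        if self.size + 1 > 0.9 * self.capacity:  # keep fill factor <= ~0.9
            self._grow()
        idx = hash(key) % self.capacity
        dist = 0
        while True:
            slot = self.slots[idx]
            if slot is None:
                self.slots[idx] = (key, value, dist)
                self.size += 1
                return
            if slot[0] == key:
                self.slots[idx] = (key, value, slot[2])  # update in place
                return
            if slot[2] < dist:
                # Robin Hood rule: the resident is "richer" (closer to
                # home), so steal its slot and keep probing with it.
                self.slots[idx] = (key, value, dist)
                key, value, dist = slot
            idx = (idx + 1) % self.capacity
            dist += 1

    def get(self, key, default=None):
        idx = hash(key) % self.capacity
        dist = 0
        while True:
            slot = self.slots[idx]
            if slot is None or slot[2] < dist:
                # A present key would have displaced this entry already,
                # so we can stop early: the key is absent.
                return default
            if slot[0] == key:
                return slot[1]
            idx = (idx + 1) % self.capacity
            dist += 1

    def _grow(self):
        old = [s for s in self.slots if s is not None]
        self.capacity *= 2
        self.slots = [None] * self.capacity
        self.size = 0
        for k, v, _ in old:
            self.put(k, v)
```

Note the early exit in `get`: because displacement keeps entries ordered by probe distance, a miss is detected as soon as a slot's stored distance drops below the current one, so unsuccessful lookups don't scan whole clusters.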

You can also make open addressing concurrent with good performance, at only 2 bytes of memory overhead per key/value:

http://preshing.com/20160314/leapfrog-probing/


