
To the contrary, linked lists are cache killers and hardly used nowadays. Or so they say on here, if I understand correctly.


Some operations are faster on a linked list than on a vector (insertion at the beginning or at a given iterator position, and deletion at an iterator position). Linked lists also use less memory in some cases. I was once optimizing a transactional tree structure that internally used a std::vector in each node. In most real use cases there was just one item in each vector, but the vector preallocated capacity for 16 items, so switching to a linked list lowered memory consumption significantly without measurable performance impact. The same applied to the hash table vs. red-black tree choice. All the basic data structures can be useful; you just have to choose the proper one.
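A minimal sketch of that trade-off, assuming typical standard-library layouts; the element type, the capacity of 16, and the printed estimates are illustrative, not the original code:

    #include <forward_list>
    #include <iostream>
    #include <vector>

    int main() {
        // A vector that has grown (or been reserved) to capacity 16 keeps
        // that whole heap buffer alive even when it holds a single element.
        std::vector<int> v;
        v.reserve(16);
        v.push_back(42);
        std::cout << "vector: capacity " << v.capacity() << ", roughly "
                  << v.capacity() * sizeof(int)
                  << " heap bytes for 1 element\n";

        // A singly linked list allocates one node per element, so the
        // common one-element case costs a single small allocation.
        std::forward_list<int> l{42};
        std::cout << "list: roughly " << sizeof(int) + sizeof(void*)
                  << " heap bytes per element (payload + next pointer)\n";

        // The flip side: iterating the list chases pointers, so traversing
        // many elements is slower than scanning a contiguous vector.
    }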


Sweeping statements like this are almost always false.

The best data structure or algorithm to use is usually problem-dependent.


Do you mean mine, or the OP's, which prompted my response?

> ... why you'll want to use a linked list instead of just plain arrays


Linked lists are the default "go-to" data structure in Haskell and many other functional languages, such as Lisp.


Storing strings as a linked list of characters in Haskell was a design mistake, IMHO, one that led to the proliferation of various more efficient string types implemented in libraries.


IMHO, linked lists can be useful in conjunction with other data structures without hurting performance too much. An open hash table with a linked list in each bucket need only involve pointer chasing in the case of collisions, which can be made arbitrarily rare by making the number of buckets large enough. In this way, performance degrades gracefully under unexpectedly heavy load rather than hitting a hard limit. I would be interested to read well-informed contrasting opinions.
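A rough sketch of that bucket-per-list ("separate chaining") layout; the class name, the fixed bucket count, and the lack of rehashing are illustrative assumptions rather than any particular library's design:

    #include <cstddef>
    #include <functional>
    #include <list>
    #include <utility>
    #include <vector>

    // Toy chained hash table: each bucket is a linked list, so lookups
    // only chase pointers when two keys actually collide in a bucket.
    template <typename K, typename V>
    class ChainedMap {
        std::vector<std::list<std::pair<K, V>>> buckets_;

    public:
        // More buckets mean rarer collisions and less pointer chasing.
        explicit ChainedMap(std::size_t bucket_count = 1024)
            : buckets_(bucket_count) {}

        void insert(const K& key, const V& value) {
            auto& bucket = buckets_[std::hash<K>{}(key) % buckets_.size()];
            for (auto& kv : bucket) {
                if (kv.first == key) { kv.second = value; return; }
            }
            bucket.push_back({key, value});  // collision: extend the chain
        }

        const V* find(const K& key) const {
            const auto& b = buckets_[std::hash<K>{}(key) % buckets_.size()];
            for (const auto& kv : b) {  // usually zero or one iteration
                if (kv.first == key) return &kv.second;
            }
            return nullptr;
        }
    };

Even when the table is overloaded, a lookup just walks a longer chain instead of failing outright, which is the graceful degradation described above.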


> hardly used nowadays

https://github.com/torvalds/linux/search?utf8=%E2%9C%93&q=IN...

That's unbearable. They should just redo this in JavaScript. That's what everyone uses nowadays. Or so they say on here, if I understand correctly.


I tried to appreciate your response, but I cannot fairly judge the code. For example, those could be parts of the code that are not performance-critical, or remnants of 20-year-old code from when caches were less effective than they are now. Are 2,000 search results a lot in a source tree of this size?

Caches, as opposed to 20+ years ago (or whenever the pattern was developed), are indeed a thing that everyone has only just started using.



How on earth can you make such a statement about a fundamental data structure? Please, take CS101 before you start throwing this kind of assertion around.



