Some operations are faster on linked lists than on vectors (insertion at the beginning, insertion at a given iterator position, erasure at an iterator position). Linked lists also use less memory in some use cases. I was once optimizing a transactional tree structure that internally used a std::vector in each node. In most real use cases there was just one item in each vector, but the vector preallocated space for 16 items, so switching to a linked list lowered memory consumption significantly with no measurable performance impact. The same applies to hash tables vs. RB trees. All the basic data structures can be useful; you just have to choose the proper one.
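To make the trade-off concrete, here is a minimal C++ sketch, assuming a node that keeps its children either in a std::vector with a 16-slot reservation or in a singly linked list; the type names and the reserve call are hypothetical stand-ins, not the original code.

    #include <forward_list>
    #include <vector>

    // Hypothetical node layouts illustrating the memory trade-off described
    // above (assumed names, not the original code).
    struct VectorNode {
        int key;
        std::vector<VectorNode*> children;   // with reserve(16), every node pays
                                             // for 16 pointer slots even when it
                                             // holds a single child
    };

    struct ListNode {
        int key;
        std::forward_list<ListNode*> children;  // one heap cell per actual child,
                                                // no unused preallocated slots
    };

    int main() {
        VectorNode v{42, {}};
        v.children.reserve(16);              // mimics the preallocation mentioned above
        ListNode l{42, {}};
        l.children.push_front(nullptr);      // the common case: a single child
    }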
Storing strings as a linked list of characters in Haskell was a design mistake, IMHO, which led to the proliferation of various efficient string types implemented in libraries.
IMHO, linked lists can be useful in conjunction with other data structures without hurting performance too much. An open hash table with a linked list in each bucket only has to chase pointers on collisions, which can be made arbitrarily rare by making the number of buckets large enough. This gives graceful performance degradation under unexpectedly heavy load rather than a hard limit. I would be interested to read well-informed contrasting opinions.
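A minimal sketch of that bucket structure, assuming std::forward_list chains over a std::vector of buckets; ChainedSet and its members are made-up names for illustration, not a real library API.

    #include <cstddef>
    #include <forward_list>
    #include <functional>
    #include <string>
    #include <vector>

    // Hash table with a singly linked list per bucket: pointer chasing happens
    // only when keys collide in the same bucket.
    class ChainedSet {
        std::vector<std::forward_list<std::string>> buckets_;

        std::size_t index(const std::string& key) const {
            return std::hash<std::string>{}(key) % buckets_.size();
        }

    public:
        explicit ChainedSet(std::size_t bucket_count) : buckets_(bucket_count) {}

        void insert(const std::string& key) {
            auto& bucket = buckets_[index(key)];
            for (const auto& k : bucket)
                if (k == key) return;        // already present
            bucket.push_front(key);          // chain grows only on a collision
        }

        bool contains(const std::string& key) const {
            const auto& bucket = buckets_[index(key)];
            for (const auto& k : bucket)     // with enough buckets this is
                if (k == key) return true;   // usually zero or one hops
            return false;
        }
    };

    int main() {
        ChainedSet set(1024);   // more buckets -> rarer collisions -> shorter chains
        set.insert("alpha");
        set.insert("beta");
        return set.contains("alpha") ? 0 : 1;
    }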
I tried to take your response seriously, but I cannot fairly judge the code. For example, those could be performance-uncritical parts of the code, or remnants of 20-year-old code written when caches mattered less than they do now. Are 2000 search results a lot in a codebase of this size?
Caches, as opposed to 20+ years ago, or whenever the pattern was developed, are indeed something everyone has only just started using.
How on earth can you make such a statement about a fundamental data structure? Please take CS101 before you start throwing this kind of assertion around.