- probabilistic algorithms (like the modern Bloom filter variants, skip lists, or treaps)
- parallel algorithms
- cache-oblivious (or merely cache-aware) algorithms. These made a lot of difference in designing trees and graph structures.
- modern hashing (like cuckoo or Robin Hood hashing), or MinHash
- data mining algorithms
...
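To illustrate the first item: a treap is a BST that stays balanced in expectation by pairing each key with a random heap priority. A minimal sketch (insert only, no deletion; all names here are my own, for illustration):

```python
import random

class Node:
    def __init__(self, key):
        self.key = key
        self.priority = random.random()  # random heap priority balances the tree in expectation
        self.left = self.right = None

def insert(root, key):
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
        if root.left.priority > root.priority:  # heap order violated: rotate right
            pivot, root.left = root.left, root.left.right
            pivot.right = root
            return pivot
    else:
        root.right = insert(root.right, key)
        if root.right.priority > root.priority:  # heap order violated: rotate left
            pivot, root.right = root.right, root.right.left
            pivot.left = root
            return pivot
    return root

def inorder(root):
    return inorder(root.left) + [root.key] + inorder(root.right) if root else []
```

The neat part is that the random priorities simulate insertion in random order, so no explicit rebalancing logic (colors, heights) is needed.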
Are you sure those are not covered? I know he covers the Bloom filter. And half his treatment of trees is about how frequency of access matters when building the tree. (He specifically does not advise just using balanced trees.)
All these things only came out after 2000. The basic Bloom filter is earlier, but modern Bloom filters (and variants like the cuckoo filter) have improved quite a bit. The standard paper on modern Bloom filters came out in 2006.
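For reference, the classic construction the modern variants build on is tiny: k hash functions setting bits in a bit array. A minimal sketch (the bit-array size, hash count, and hashing scheme here are arbitrary illustrative choices, not from any particular paper):

```python
import hashlib

class BloomFilter:
    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = 0  # a Python int doubles as a growable bit array

    def _positions(self, item):
        # Derive k indices by salting a cryptographic hash; real
        # implementations use cheaper hash families.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        # False positives are possible; false negatives are not.
        return all(self.bits >> pos & 1 for pos in self._positions(item))
```

The modern work is largely about doing better than this baseline: fewer hash computations, cache-friendly blocked layouts, and (in the cuckoo filter) support for deletion.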
It's also true for other areas. Let's take sorting. The best general-purpose comparison sort is timsort, and timsort came out in 2002.
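The core insight timsort exploits is that real-world data is often partially sorted, so you detect the runs that already exist and merge them. That idea can be sketched as a natural merge sort (the real timsort adds minrun sizing, a run stack with merge invariants, and galloping mode on top of this):

```python
def find_runs(a):
    # Split the list into maximal non-decreasing runs.
    runs, i, n = [], 0, len(a)
    while i < n:
        j = i + 1
        while j < n and a[j - 1] <= a[j]:
            j += 1
        runs.append(a[i:j])
        i = j
    return runs

def merge(left, right):
    # Standard stable two-way merge.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out

def natural_merge_sort(a):
    runs = find_runs(a) or [[]]
    while len(runs) > 1:
        merged = [merge(runs[k], runs[k + 1]) for k in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:
            merged.append(runs[-1])
        runs = merged
    return runs[0]
```

On already-sorted input this does one O(n) pass and no merging at all, which is exactly the behavior that makes timsort shine on nearly-sorted data.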
Keep in mind TAOCP was originally published in the late '60s and early '70s (the first volume came out in 1968), and even though there were some new editions (up to the late '90s), they didn't really change all that much.
You can just scan over TAOCP's bibliography and look at the average year of the citations. It is usually the '60s.
Claiming that CS has not improved since the '60s is fairly absurd.
I won't claim it hasn't improved. I will claim there are a lot of rediscoveries.
Still, your point is fair. The section on secondary key retrieval is very high-level, and Bloom filters get just two paragraphs.
I think the biggest change in CS is the view towards memory. In much of TAOCP, care is given to the memory footprint. Much of day-to-day programming is completely ignorant of memory. Scarily so.