20 years is a long time. For some fields it is perfectly reasonable, but 20-year patents on many recent CS inventions would have significantly bottlenecked the industry's development. Look at how much of a mess the JPEG patents created, for example, and similar problems have existed for every other not-explicitly-libre A/V codec.
While we're on the subject of patents and Nvidia: their patent on using quasi-Monte Carlo methods in rendering lets them hold basically the whole path-tracing world hostage, e.g. possibly forcing people onto CUDA who might otherwise have used OpenCL.
They didn't even invent the numerical methods themselves (the underlying mathematics is old work from other countries); they were just the first to file for a particular application.
There's a strong adverse-selection effect, though. Because you must publish to be granted a patent but can sue whenever anyone infringes (whether willfully or not), the incentive is to patent obvious approaches that don't work well and to keep the best approach, the one you actually use, as a trade secret. That way, anyone attempting to replicate you likely ends up in a patent minefield, yet you never give away the keys to the castle in a patent whose infringement you would have to detect yourself.
I believe that "few years" is actually 20 years, though. I hadn't thought of patents from this perspective, but 20 years is still a long time (and a large chunk of your working life) to benefit from something.
In a few years, the temporary monopoly falls away and the benefit passes to everyone.
I think they should work to make them even cheaper and easier to file.