So someone else needs to prove a negative, that there exists no possible technology that will "disrupt" cryocoolers by bringing them to an unspecified price/performance point by an unspecified time in the future to service an unspecified computing use-case?
> someone else needs to prove a negative, that there exists no possible technology that will "disrupt" cryocoolers
Look at the Wikipedia references for cryocoolers [1]. Note the dates and volume. Now look at room-temperature superconductors [2].
1990 vs. 2023; 5 references vs. 57. OP is arguing that a greater fraction of high-temperature-superconductor research dollars than we presently spend might find purchase in improving the cryocooler.
> Now look at room-temperature superconductors [2].
As a tangent: Don't forget to look at pressures too. Some newer near-room-temperature superconductors aren't quite as exciting when it turns out they require over 1.5 million times normal atmospheric pressure. ("Hey, I think I over-tightened the CPU heatsink...")
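For a sense of scale, a quick unit conversion (a sketch; one standard atmosphere is 101,325 Pa):

    # Back-of-envelope: what "1.5 million atmospheres" means in GPa
    ATM_PA = 101_325                      # one standard atmosphere, in pascals
    pressure_gpa = 1.5e6 * ATM_PA / 1e9
    print(f"{pressure_gpa:.0f} GPa")      # ~152 GPa

That's the same order of magnitude as the pressure at Earth's core, inside a diamond anvil cell rather than under a heatsink.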
The problem is, cryocooler theory is pretty well established and "solved" at this point, so there is no reason to expect completely new phenomena there, just some engineering improvements. Solid-state physics, on the other hand, is computationally infeasible to "solve", so there is plenty of room to discover something unpredicted.
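As a toy illustration of that infeasibility (a sketch, assuming spin-1/2 sites and ignoring every symmetry a real solver would exploit):

    # The quantum state of N spin-1/2 particles needs 2**N complex amplitudes.
    # Brute-force "solving" a solid (~1e23 sites) is hopeless; even 300 sites
    # already exceed the ~1e80 atoms in the observable universe.
    for n in (10, 50, 300):
        amplitudes = 2 ** n
        bytes_needed = amplitudes * 16   # 16 bytes per double-precision complex
        print(f"{n} spins: {amplitudes:.2e} amplitudes, {bytes_needed:.2e} bytes")

Which is why the field leans on approximations, and why those approximations keep leaving room for surprises.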
The latter aren't being researched for CPUs, which are small things. They're for applications like long-distance power transmission, electric motors, and more: things that aren't feasible to cryocool.
IIRC that's unlikely to change quickly even with higher-temp superconductors, since it would mean splicing in new power-grid segments that carry direct current instead of alternating current, and then you have losses in the conversion too.
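Rough illustration of the conversion-loss point, as a sketch; the ~1% per converter station and ~7%/1000 km AC line loss are assumed ballpark figures, not engineering data:

    # A spliced-in superconducting DC segment drops line loss to ~zero but
    # pays a fixed toll at each AC<->DC converter station. All figures are
    # assumptions for illustration only.
    CONVERTER_LOSS = 0.01         # assumption: ~1% lost per converter station
    AC_LOSS_PER_1000KM = 0.07     # assumption: ~7% resistive loss per 1000 km AC

    def ac_delivered(km):
        return (1 - AC_LOSS_PER_1000KM) ** (km / 1000)

    def spliced_dc_delivered(km):
        return (1 - CONVERTER_LOSS) ** 2   # lossless line, two converters

    for km in (100, 500, 1000, 2000):
        print(f"{km} km: AC {ac_delivered(km):.3f} vs DC {spliced_dc_delivered(km):.3f}")

Under these made-up numbers the spliced segment only pays off past a few hundred kilometers, which is roughly why the grid wouldn't flip over quickly even if the wire itself were free of loss.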
That seems more than a little unfair. :p