There are definitely kinds of data that xz can compress better than zstd, or equally well but faster, even taking into account the more extreme compression levels zstd offers. I know because we have many terabytes of such data, and I've done thorough comparisons between xz and zstd across all the different compression levels.
However, in all cases zstd is much faster at decompression. It just happens that for us, getting the best compression ratio in a sane amount of time is still the better tradeoff.
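For anyone wanting to run this kind of comparison themselves, here's a minimal sketch. It assumes the xz and zstd CLIs are installed, and `sample.bin` is just a placeholder for your own data:

```shell
#!/bin/sh
# Compare compression ratio and decompression time for xz vs zstd
# at a few representative levels. Hedged sketch, not a rigorous benchmark:
# for real measurements, use larger data and repeat the timings.
set -eu

f=sample.bin
# Create a compressible sample file if none exists (placeholder data)
[ -f "$f" ] || seq 1 200000 > "$f"

orig=$(wc -c < "$f")

# zstd: levels 1-19 are standard (20-22 need --ultra)
for level in 3 9 19; do
    zstd -q -f -"$level" "$f" -o "$f.zst"
    printf 'zstd -%-2s: %s -> %s bytes\n' "$level" "$orig" "$(wc -c < "$f.zst")"
done

# xz: -k keeps the input file instead of deleting it
for level in 6 9; do
    xz -q -f -k -"$level" "$f"
    printf 'xz   -%-2s: %s -> %s bytes\n' "$level" "$orig" "$(wc -c < "$f.xz")"
done

# Decompression speed is where zstd really shines; time both
time zstd -q -d -f "$f.zst" -o /dev/null
time xz -q -d -c "$f.xz" > /dev/null
```

Note that the files written by the loops get overwritten each iteration, so the timed decompression runs use the highest level from each loop.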
Edit: Arch Linux switched from xz to zstd for their packages and somebody compared both: https://sysdfree.wordpress.com/2020/01/04/293/
The Arch Linux developers state that they expect a 0.8% increase in package size but a 1300% speedup in decompression. Not too shabby.

I'm running Arch on my personal system and it's really noticeable, especially when I create my own packages: compression doesn't take longer than compiling anymore.