Don't think of spreadsheets as glorified tables. Think of them as the world's most commonly used programming language for business logic and statistics. A competitor to R, if you will.
Statistics, sure, that's definitely a good candidate for GPUs. I don't know much about R, but a quick Google suggests you can run R code on a GPU by working with certain object types, such as matrix classes whose operations are GPU-accelerated.
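For illustration (in JAX rather than R, since I can't vouch for the specifics of R's GPU packages), the kind of object-level dispatch I mean looks roughly like this: one call covers the whole matrix, and there are no per-cell formulas anywhere.

```python
# Illustrative only: JAX standing in for R's GPU-backed matrix classes.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
a = jax.random.normal(key, (4096, 4096))  # dense matrices; JAX places them on the
b = jax.random.normal(key, (4096, 4096))  # GPU automatically if one is available

# One call hands the entire multiply to the accelerator,
# with no per-element formulas anywhere.
c = jnp.dot(a, b)
print(c.shape)  # (4096, 4096)
```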
That doesn't seem to map very well to a spreadsheet unless you have one big matrix per cell. I'm guessing (maybe incorrectly) that when people work with matrices in Excel, the data is spread across a grid of cells. You probably could detect matrix-like operations and convert them to GPU batch jobs, but that seems very hard, and I'm skeptical of how much you'd gain.
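To make that concrete, here's a toy contrast (Python/JAX again, with made-up numbers) between the cell-at-a-time model and the single batched array operation a GPU actually wants. Recognizing that a grid of formulas is secretly the second thing is the hard part.

```python
# Toy contrast, illustrative only: a 1000x1000 block where every cell
# computes x*y + 1 from two input blocks of the same shape.
import jax
import jax.numpy as jnp

kx, ky = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(kx, (1000, 1000))
y = jax.random.normal(ky, (1000, 1000))

# Spreadsheet-style: one tiny scalar formula per cell.
# (Only a 20x20 corner here, because doing it cell by cell is painfully slow.)
per_cell = [[float(x[i, j]) * float(y[i, j]) + 1.0 for j in range(20)]
            for i in range(20)]

# GPU-style: the whole block expressed as one array operation.
batched = x * y + 1.0
```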
So I'm still wondering what kinds of typical Excel tasks are amenable to GPU acceleration in the first place. People use Excel to do a lot of surprising things, sure. But people use C++ and Python and JavaScript for a lot of things too, and you can't just blithely move those over to the GPU.
Maybe it's specific expensive operations, like "fit a curve to the data in this huge block of cells"?
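Something in that spirit would look like the sketch below: a least-squares polynomial fit over a million-row column, expressed as a couple of dense linear-algebra calls, which is exactly the shape of work GPUs are good at. (JAX again, purely illustrative; none of these names correspond to anything Excel actually exposes.)

```python
# Illustrative sketch: degree-2 least-squares fit over a big column of data.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(42)
x = jnp.linspace(0.0, 10.0, 1_000_000)  # imagine a million-row column
y = 3.0 * x**2 - 2.0 * x + 5.0 + jax.random.normal(key, x.shape)  # noisy quadratic

# Design matrix with columns [x^2, x, 1]; the fit is one batched solve.
A = jnp.stack([x**2, x, jnp.ones_like(x)], axis=1)
coeffs, *_ = jnp.linalg.lstsq(A, y, rcond=None)
print(coeffs)  # roughly [3, -2, 5]
```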