Does Nvidia have integrated memory options that allow you to get up to 64GB+ of VRAM without stringing together a bunch of 4090s?
For local LLMs, Apple Silicon has really shown the value of shared memory, even if that comes at the cost of raw GPU power. Even if it's half the speed of an array of GPUs, being able to load the mid-sized models at all is a huge plus.
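The "fits at all" point comes down to simple arithmetic: weight memory is roughly parameter count times bits per weight. A rough back-of-the-envelope sketch (illustrative numbers only; real usage also needs room for the KV cache and runtime overhead):

```python
# Estimate how much memory a model's weights need at a given
# quantization, to see what fits in a unified-memory machine.

def weight_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB (using 1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 70B-parameter model at 4-bit quantization:
print(weight_gb(70, 4))   # 35.0 GB -> fits in 64 GB of shared memory
# The same model at fp16:
print(weight_gb(70, 16))  # 140.0 GB -> doesn't fit
```

So a 64GB unified-memory machine can hold a quantized 70B-class model that no single consumer GPU can, which is the whole appeal even at lower throughput.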
It also costs 4x as much as the entire Framework Desktop for just the card. If you're doing something professional, that's probably worth it, but it's not a clear winner in the enthusiast space.