I'm not sure if this will impact the market the same way the R1 did. However, my general impression is that while Meta spent $20B on their 100k H100s, DeepSeek is demonstrating that you can achieve better results far more cost-effectively using just 2k H100s. This doesn't seem like good news for Nvidia, but it sets a great precedent for companies looking to train models.
It’s essentially as if the number of existing premium GPU chips had been multiplied 30x or 50x. And yes, when you 30x the supply of something, you are going to lower its price. The question is whether this lower price then increases demand, but that’s a lot more speculative than the supply impact, and could easily take much longer to be felt.
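To make the arithmetic concrete, here's a rough back-of-the-envelope sketch. The ~50x figure falls straight out of the GPU counts above; the demand side is a toy constant-elasticity curve, and the elasticity values are made up purely for illustration, not estimates of the real market.

```python
# Back-of-the-envelope sketch of the "effective supply" argument above.
# GPU counts are the ones from the comment; elasticity values are
# hypothetical, chosen only to show how the answer swings.

meta_gpus = 100_000      # H100s in the Meta cluster cited above
deepseek_gpus = 2_000    # GPUs DeepSeek reportedly trained with

# If the same result needs ~50x fewer GPUs, each existing GPU is "worth"
# roughly that many of the old ones in training terms.
efficiency_multiplier = meta_gpus / deepseek_gpus   # = 50.0

def relative_demand(price_ratio: float, elasticity: float) -> float:
    """Demand at the new price relative to the old, under a toy
    constant-elasticity demand curve: quantity ~ price ** (-elasticity)."""
    return price_ratio ** (-elasticity)

# Suppose the effective price per unit of training compute falls by the
# full multiplier (an optimistic assumption).
price_ratio = 1 / efficiency_multiplier

for elasticity in (0.5, 1.0, 1.5):
    print(f"elasticity={elasticity}: demand x{relative_demand(price_ratio, elasticity):.1f}")
```

Under that toy model, whether total GPU spend shrinks or grows (Jevons-style) hinges entirely on the elasticity, which is exactly the speculative part.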