I think local models will become more and more relevant. First of all, the computing power available locally for inference keeps going up. In addition, inference is much cheaper than training. Finally, there are diminishing returns from adding more and more parameters.
Yes, and if we start chatting with a local AI model instead of searching on Google, we get an overall increase in privacy. So the new local AI era could actually be much more private than the previous Web era - one in which we have much more freedom.
Local models also solve a lot of the trust issues that come with sending every query to a remote provider.
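To make the privacy point concrete, here is a minimal sketch of what fully local chat inference can look like, assuming the llama-cpp-python package and a quantized GGUF model file downloaded ahead of time (the model path and question below are hypothetical):

```python
# A rough sketch of fully local chat, assuming llama-cpp-python is installed
# and a quantized GGUF model file already sits on disk (path is hypothetical).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical file
    n_ctx=2048,  # context window; modest laptops handle this fine
)

# The prompt is processed entirely on this machine: no query, no response,
# and no usage metadata is sent anywhere, unlike a web search.
reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What are common symptoms of the flu?"}],
    max_tokens=128,
)
print(reply["choices"][0]["message"]["content"])
```

The trade-off is that the first setup step (downloading a multi-gigabyte model file) replaces the convenience of a hosted service, but after that every question stays on the device.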