Indeed, let's not forget that Google's BERT model was a very hot topic just a few years ago, and their in-house researchers invented word2vec, which laid much of the groundwork for modern language modeling. Maybe they've been resting on their laurels, but given the growing hype around GPT models (even before ChatGPT), I'd be surprised if nobody at Google was already working on this.