A Large Language Model is just a text-predicting Transformer (and sometimes an image- and/or audio-predicting one, if it's multimodal). Transformers are general sequence-to-sequence predictors. The only real difference between a generic Transformer and GPT is the data it's been trained on; they're the same kind of neural network.
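To make "text predicting" concrete, here is a minimal sketch of the autoregressive loop every LLM runs at inference time. The `next_token_probs` function is a hypothetical stand-in for a trained Transformer; a real model would compute these probabilities with attention layers over the whole prefix.

```python
def next_token_probs(prefix):
    # Toy stand-in for a trained model: deterministically continues
    # a fixed phrase, then emits an end-of-sequence token.
    phrase = ["the", "cat", "sat"]
    i = len(prefix)
    if i < len(phrase):
        return {phrase[i]: 1.0}
    return {"<eos>": 1.0}

def generate(prefix, max_tokens=10):
    # The core LLM loop: predict a distribution over the next token,
    # pick one, append it, and repeat until <eos> or a length cap.
    tokens = list(prefix)
    for _ in range(max_tokens):
        probs = next_token_probs(tokens)
        token = max(probs, key=probs.get)  # greedy decoding
        if token == "<eos>":
            break
        tokens.append(token)
    return tokens

print(generate([]))  # → ['the', 'cat', 'sat']
```

Swapping the toy function for a neural network trained on text gives you GPT; training the same architecture on image or audio tokens gives you a multimodal model. The loop itself doesn't change.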