I find the terminology is used inconsistently*, so it's probably always worth asking.
To me, a "large language model" is always going to mean "does text"; but the same architecture (transformer) could equally well be trained on any token sequence: sheet music, genes, whatever.
IIRC, transformers aren't as good a fit for image generation, or continuous domains in general; they really do work best where a discrete "token" is the natural representation.
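To illustrate the point about modality-agnosticism: a minimal sketch (toy dimensions, numpy only, not a full transformer) of a self-attention pass that only ever sees integer token IDs. Whether those IDs index word pieces, MIDI pitches, or codons is invisible to the architecture; all the names and sizes here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, d_model = 16, 8  # tiny toy sizes, purely illustrative

# Embedding table: maps any integer token ID to a vector. The table doesn't
# know (or care) whether the IDs came from text, music, or gene sequences.
E = rng.normal(size=(vocab_size, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))

def self_attention(token_ids):
    """One self-attention pass over a sequence of integer token IDs."""
    x = E[token_ids]                       # (seq_len, d_model)
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d_model)    # pairwise attention scores
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)     # row-wise softmax
    return w @ v                           # (seq_len, d_model)

# The identical code runs on "words" or "notes": only the IDs matter.
text_like  = [3, 7, 1, 12]   # e.g. word-piece IDs
music_like = [0, 4, 7, 4]    # e.g. quantised MIDI pitches
print(self_attention(text_like).shape)
print(self_attention(music_like).shape)
```

The same observation is why the continuous case is awkward: pixels don't arrive as a small discrete vocabulary, so image models typically need an extra quantisation or patching step before a transformer can consume them.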
* e.g., to me, if it's an AI and it's general then it's an AGI, so GPT-3.5 onwards counts; what OpenAI means when they say "AGI" is what I'd call "transformative AI"; there are plenty of people on this site who assert that it's not an AGI, but whenever I've dug into the claims it seems they use "AGI" to mean what I'd call "ASI" ("S" for "superhuman"); and still others refuse to accept that LLMs are AI at all, despite their coming from AI research groups publishing AI papers in AI journals.