Of course it can, but I think the issue is that people may try to jailbreak it or do something funny to get a weird response, then post it on x.com against the company. There must be techniques to turn an LLM into a FAQ-forwarding bot, but then what's the point of having an LLM?
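
For what it's worth, "FAQ forwarding" usually just means the bot is only allowed to pick from pre-approved answers rather than free-generate, which is exactly why it stops feeling like an LLM. A minimal sketch of that idea (no LLM at all, made-up FAQ entries, plain string matching standing in for whatever retrieval the real thing would use):

    import difflib

    # Hypothetical canned FAQ; in practice this would come from the company's support docs.
    FAQ = {
        "how do i reset my password": "Go to Settings > Account > Reset password.",
        "what is your refund policy": "Refunds are available within 30 days of purchase.",
        "how do i contact support": "Email support@example.com or use the in-app chat.",
    }

    def answer(user_question: str, threshold: float = 0.6) -> str:
        """Forward the closest canned FAQ answer, or punt to a human.

        Nothing here free-generates text: the only possible outputs are
        the pre-approved answers above or the fallback line, so there is
        nothing to jailbreak and also nothing interesting left of the LLM.
        """
        question = user_question.lower().strip()
        best_key, best_score = None, 0.0
        for key in FAQ:
            score = difflib.SequenceMatcher(None, question, key).ratio()
            if score > best_score:
                best_key, best_score = key, score
        if best_key is None or best_score < threshold:
            return "I'm not sure -- let me connect you with a human agent."
        return FAQ[best_key]

    print(answer("How do I reset my password?"))
    # -> Go to Settings > Account > Reset password.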