LLMs might hallucinate, potentially misleading you and your customers.
LLMs might not stay on topic; they can get lost in the middle.
LLMs might not say 'I do not know' when they don't know.
LLMs may be vulnerable to prompt injecting attacks.
The chosen LLM might not be the best fit for your task.