

Especially if you’re asking about something you’re not educated in or experienced with
That’s the biggest problem for me. When I ask about something I’m well educated in, it produces either the right answer, a very opinionated POV, or clear bullshit. When I use it for something I’m not educated in, I’m afraid I’ll get bullshit back. So there I am, with no way of knowing whether what I’m holding is bullshit or not.

Hallucination is not just any mistake, if I understand it correctly. LLMs make mistakes, and that is the primary reason I don’t use them for my coding job.
Like a year ago, ChatGPT made up a Python library, with a made-up API, to solve the particular problem I asked about. The last hallucination I can recall was it claiming that manual is a keyword in PostgreSQL, which it is not.
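One cheap sanity check for the made-up-library failure mode: before trusting a module or function an LLM names, probe whether it's even importable. A minimal Python sketch (the name magic_solver below is deliberately fake, just for illustration):

```python
import importlib
import importlib.util


def module_exists(name: str) -> bool:
    """Return True if a module with this name can be found on the path."""
    return importlib.util.find_spec(name) is not None


def module_has_attr(module_name: str, attr: str) -> bool:
    """Return True if the module exists and actually exposes the attribute."""
    if not module_exists(module_name):
        return False
    mod = importlib.import_module(module_name)
    return hasattr(mod, attr)


# A real stdlib module and a real function on it:
print(module_exists("json"))             # True
print(module_has_attr("json", "dumps"))  # True

# A made-up library name, like the one ChatGPT invented:
print(module_exists("magic_solver"))     # False
```

It won't catch subtler hallucinations (wrong argument names, invented behavior), but it kills the "library that doesn't exist" case in seconds.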