I'm curious how prevalent the practice of relying exclusively on an LLM for the information needed to resolve a question is. Is this considered good practice? Does the choice of LLM matter?
Absolutely not. This site's userbase has a real problem with thinking LLMs are the ultimate manifestation of Logos on Earth. No, they get things laughably wrong and hallucinate nonsense all the time. No serious person should currently (and I'm not exactly convinced they ever should) trust anything halfway complicated an LLM spits out without verifying the information thoroughly.
@ProjectVictory Exactly, and there are others. Nothing against using AI to gather info, but if you're going to use it as an oracle, you should include that in the initial description so people can stay away.