I don’t think you can blame anyone except him for not understanding that AI sometimes hallucinates. Hell, basically every AI tool makes you read a warning about that before you start using it. I’m normally very reluctant to blame users for using things incorrectly, but if it took him 3 hours to realise the answer was wrong, I have to say that’s on him.
Come on, has anyone else ever persevered with a hallucinated answer for 3 hours before realising it was a hallucination? The longest it’s taken me is like 5 minutes, and that’s only for things that aren’t easily googleable.