Friday, June 13, 2025

Hallucination rates on the rise

UPDATE:  I picked up some shares of Google today in spite of this post.  Since I now have a financial interest in the company, I'm disclosing that when necessary.

Well, this is a bit surprising.  I knew LLMs hallucinate from time to time, but I didn't expect that hallucination rate to rise.

https://finance.yahoo.com/news/rates-hallucination-ai-models-google-173138158.html

Are these things actually getting worse, or is this just a statistical blip?

Personal anecdote time: A few days ago, I googled "what events happened on (my birthday)" to see what Google's search AI would say about it. I typed out the exact date and spelled out the full name of the month so that there was no excuse for interpreting the date incorrectly. The LLM gave me several events. The first two events did NOT occur on my birthday but rather a month earlier, according to Wikipedia. Yet the AI's answer stated that they happened on (my birthday). It straight-up lied to me.

This is a very easy thing to get right. There's no excuse for it.
