
I believe this phenomenon is called "hallucination". It's when a language model goes beyond its training data and makes up information out of thin air. All language models have this flaw, not just ChatGPT.