Hallucinations are a common occurrence with AI tools and an inherent part of predictive algorithms: they occur when a model's predictions don't match reality. Watch the video below to learn more about why hallucinations happen. Use this link if the embedded video doesn't work.
One research study found that 52% of ChatGPT responses contained inaccuracies. These can take the form of simple errors, such as incorrect names or dates. When the mistakes are omissions, however, it's hard for readers to identify what is missing. The same study found that users overlooked mistakes 39% of the time, and that users preferred the ChatGPT responses because of their authoritative writing style. That style makes it especially difficult to evaluate answers on topics you're relatively new to.