In this episode, I talk about:
- The phenomenon of AI hallucinations – false or misleading information that is presented as fact
- How and why AI hallucinations occur with platforms such as ChatGPT
- 6 hilarious and scary examples of ChatGPT hallucinations, including:
  - Lying about how many people survived the sinking of the Titanic
  - Fabricating scientific references to support the idea that cheese is bad for your health
  - Writing a New York Times opinion piece about why mayonnaise is racist
  - Making up a historical French king
  - Writing a rave review for the ill-fated Fyre Festival
  - Inventing a world record for a man walking on water
- Why you should still fact-check ChatGPT’s responses, despite improvements in the AI’s accuracy
View show notes, including links to all the resources and tools mentioned: https://thedigitaldietcoach.com/025
Get your free #TechTimeout Challenge 30-Day Digital Detox Guide
Keep in touch with me:
----------------------------------------
Music by FASSounds