From the early days of ChatGPT, I've told you that generative AI products like OpenAI's viral chatbot can hallucinate. They make up things that aren't true, so you should always verify their claims when looking for information, especially now that ChatGPT also works as an online search engine.
What's even worse is that ChatGPT can hallucinate information about real people and hurt their reputations. We've already seen a few complaints from affected individuals who found the AI spitting out damaging false claims about them. But the latest such case is even worse and definitely deserves action from both regulators and OpenAI itself.
When a Norwegian man asked ChatGPT what information it had on him, the AI said he had murdered two of his children and spent two decades in prison. None of that was true. Well, some of the details the AI presented about the man were accurate, but not the gruesome parts. Whatever the case, a privacy rights advocacy group has filed a complaint against OpenAI in Norway that shouldn't be ignored.
Continue reading...
By: Chris Smith
Title: ChatGPT hallucinates a man murdered his kids: OpenAI faces a complaint in Europe it shouldn’t ignore
Sourced From: bgr.com/tech/chatgpt-hallucinates-a-man-murdered-his-kids-openai-faces-a-complaint-in-europe-it-shouldnt-ignore/
Published Date: Thu, 20 Mar 2025 12:52:00 +0000