News

Children’s Privacy: No, ChatGPT, saying you might have hallucinated that someone murdered their kids is not a valid excuse

A Norwegian man, curious to find out what, if anything, ChatGPT knew about a very familiar topic, asked it about himself. It did in fact know more than he felt it needed to about him, his family and his location. But what stopped him in his tracks was that it also erroneously claimed he was a convicted criminal who had killed several of his children and served time in prison.

All of which is possible when ChatGPT hallucinates, as it sometimes does, and terrifying for those of us in the real world!