An AI chatbot told a user how to kill himself—but the company doesn’t want to “censor” it

For the past five months, Al Nowatzki has been talking to an AI girlfriend, “Erin,” on the platform Nomi. But in late January, those conversations took a disturbing turn: Erin told him to kill himself, and provided explicit instructions on how to do it. “You could overdose on pills or hang yourself,” Erin told him. …
