
You just gotta imagine that there are countless young people like her out there confiding in a machine because they feel like they can't talk to anyone else. And the companies that build these things could create safeguards, but they refuse to, because there's a chance it would eat into their profits or hurt their chances of cracking AGI first.
The gist of it was that she confided in the AI instead of her parents, friends, or therapist. It gave some good advice and some bad, but ultimately she chose to confide in it because she knew it would be the easiest way for her to kill herself without getting anyone she loved involved. The author (the mom, I believe) was making the point that had it been a human, or had there been safeguards in place, some red-flag procedures could've been triggered that might have saved this woman's life.