hey maybe we need to regulate this shit immediately
Sidechat image post by Anonymous in US Politics. 31 upvotes, 6 comments.

🐸 Anonymous 18w

Yeah that was a hard one to read ngl. It feels like we’re nearing the crossroads where we either meaningfully regulate this stuff or we accept that we’re headed towards a Blade Runner-type dystopian hellscape

11 upvotes
Anonymous 18w

What did she say

1 upvote
Anonymous 18w

What is she supposed to be standing behind

1 upvote
🐸 Anonymous replying to fuuuckyikyak 18w

You just gotta imagine that there are countless young people like her out there confiding in a machine because they feel like they can’t talk to anyone else. And the companies that build them could create safeguards, but they refuse to do so because there’s a chance it would eat into their profits or hurt their chances of cracking AGI first

7 upvotes
🐸 Anonymous replying to #1 18w

The gist of it was that she confided in the AI instead of her parents, friends, or therapist. It gave some good advice and some bad, but ultimately she chose to confide in it because she knew it would be the easiest way for her to kill herself without getting anyone she loved involved. The author (the mom, I believe) was making the point that had it been a human, or had there been safeguards in place, some red-flag procedures could’ve been triggered that might have saved this woman’s life

3 upvotes
🐸 Anonymous replying to fuuuckyikyak 18w

Like how therapists are mandated reporters who can (if necessary) have someone involuntarily committed to a hospital if they are at serious risk of harming themselves

3 upvotes