love how “don’t use ai for therapy” is controversial when ppl know how harmful it is to everyone and the environment, people have killed themselves because of it, ai psychosis is a known thing, and even the companies behind them have said not to
Anonymous 3w

not to mention the fact that they harvest and store your data and aren't bound by HIPAA, so they can freely tell your business to everyone, and anyone can access that data if they know how to ask it to spit it back out. it's not safe. even putting aside how damaging it is for the environment and how it's built off of stolen works, it's not safe for you to use it like that. it's a "yes, and" machine and IS JUST ROLEPLAYING. IT DOESN'T KNOW YOU'RE SERIOUS. IT'S NOT REAL. IT CANNOT HELP YOU.

Anonymous replying to -> OP 3w

to elaborate on the last part, too: it's not trained to be a therapist who will tell you when you have harmful thought patterns or are in the wrong. it's just going to always tell you everyone else is in the wrong and you're right and perfect, or maybe flag one minor thing, but it's not going to challenge you. find a therapist or, if you can't, there are dozens of online resources. use them if you need them.
