In psychotherapists vs. ChatGPT showdown, the latter wins, new study finds (fortune.com)
Posted by cyrano@lemmy.dbzer0.com to Technology@lemmy.world · English · 10 days ago
Cross-posted to: science@lemmy.world, technology@lemmy.zip
PapstJL4U@lemmy.world · 9 days ago
Patients are reporting that they liked what they heard, not whether it was correct or relevant to the cause. There isn't even a pipeline for escalation, because AIs don't think.
jubilationtcornpone@sh.itjust.works · 9 days ago
Exactly. AI chatbots also cannot empathize, since they have no self-awareness.
asap@lemmy.world · 9 days ago
You can't say "Exactly" when you tl;dr'd the article and removed one of its most important parts. Your human summary was literally worse than AI 🤦
desktop_user@lemmy.blahaj.zone · 9 days ago
But it can give the illusion of empathy, which is far more important.