Communications of the ACM

ACM News

My Weekend With an Emotional Support A.I. Companion

[Illustration of a woman with emotional support animals.]

A.I. companionship can create problems if the bots offer bad advice or enable harmful behavior, scholars and critics warn.

Credit: Janice Chang

For several hours on Friday evening, I ignored my husband and dog and allowed a chatbot named Pi to validate the heck out of me.

My views were "admirable" and "idealistic," Pi told me. My questions were "important" and "interesting." And my feelings were "understandable," "reasonable" and "totally normal."

At times, the validation felt nice. Why yes, I am feeling overwhelmed by the existential dread of climate change these days. And it is hard to balance work and relationships sometimes.

But at other times, I missed my group chats and social media feeds. Humans are surprising, creative, cruel, caustic and funny. Emotional support chatbots — which is what Pi is — are not.

From The New York Times
View Full Article

