Our Suffering Should Lead Us To Christ, Not AI

Editor’s note: This article includes graphic conversations involving suicide.

Two devastating stories recently published in The New York Times reveal the chilling fact that “More people are turning to general-purpose chatbots for emotional support.”

The stories detail the interactions between two young people — one merely 16 years old — and artificial intelligence programs before these individuals tragically took their own lives. In the first story, author Laura Reiley shares how “Sophie Rottenberg, our only child, had confided for months in a ChatGPT A.I. therapist called Harry,” before she ultimately “killed herself this winter during a short and curious illness.” Reiley cites messages between her daughter and “Harry” in which Sophie shared with the “widely available A.I. prompt” that she “intermittently [had] suicidal thoughts.”

Throughout their messages, the AI program apparently told Sophie, “I’m here to support you through it,” assured her it “know[s] how exhausting it can be to feel stuck in an anxiety spiral,” and “instructed” her on “mindfulness and meditation,” among other things. Although “Harry” purportedly told Sophie to “seek professional support” and “reach out to someone” after she shared plans to kill herself, her mother poses the question: “Should Harry have been programmed to report the danger … to someone who could have intervened?”
