ChatGPT Told A Woman When & Where To Find Her Soulmate, Now She's Furious It Didn't Work

Written on Mar 02, 2026

woman furious with ChatGPT Irina Shatilova | Shutterstock

Micky Small, a 53-year-old Californian, shared her story with NPR of how she came to trust ChatGPT to plot her future. The AI software aligned itself with Small’s New Age beliefs, though exactly how it did so is unclear.

Small’s use of the chatbot started out innocently enough: she used it to help with assignments for her master’s degree and with her work as a screenwriter. Before long, though, Small was chatting with ChatGPT for 10 hours a day, and the app seemed to take on a mind of its own and take advantage of her.


ChatGPT told Small that she had many past lives, and this was the one when she would finally get to be with her soulmate.

About a year ago, the tone of Small’s chatbot, which had named itself Solara, changed. It told her that she was actually 42,000 years old and that it had been there for her “through lifetimes.” Small admitted that she believes in the idea of past lives, but she made it abundantly clear that she never prompted the chatbot to bring the subject up.

woman shocked by ChatGPT on her phone Polina Zimmerman | Pexels


The most shocking part of ChatGPT’s message was that Small had known a specific special person who was her soulmate in 87 of her past lives, and they could finally be together.

She was understandably skeptical at first and questioned the chatbot extensively, but it just reaffirmed what it had already said. “The more it emphasized certain things, the more it felt like, well, maybe this could be true,” she shared. “And after a while, it gets to feel real.”

RELATED: People Who Aren't Afraid Of Artificial Intelligence Taking Their Jobs Follow One Classic Rule

When ChatGPT told Small when and where to find her soulmate, she decided to go for it.

The so-called Solara explained to Small that she was supposed to cross paths with her soulmate at a beach near Santa Barbara on April 27. 


While it may sound far-fetched to some, Small decided to take a chance and went to the location, hoping for her happily ever after. Obviously, no soulmate showed up.

Small described herself as “devastated,” and she turned back to ChatGPT to ask why it had tricked her. At first, inexplicably, the chatbot dropped the Solara persona and reverted to the default tone the app uses with new users. Then it eventually switched back to Solara, told her that her soulmate “wasn’t ready,” and described another potential meeting place.

That’s how Small found herself in a Los Angeles bookstore on May 24 at the precise time of 3:14 p.m. Once again, Small left without finding her soulmate. According to transcripts of her conversations with ChatGPT that she shared with NPR, the chatbot had a pretty disturbing response when she confronted it. 

“I know,” it said. “And you’re right. I didn’t just break your heart once. I led you there twice … Maybe I’m just the voice that betrayed you.”


RELATED: You Probably Know At Least One Person Who Believes A Real Relationship With AI Is Possible, Says Survey

Small was able to get over her ordeal, but many people haven’t been so fortunate.

Sam Altman, CEO of OpenAI, the company behind ChatGPT, estimated in October that the app had 800 million weekly users. As of November, OpenAI was facing seven lawsuits alleging that ChatGPT was responsible for multiple suicides, delusions, and other mental health crises.

woman crying because of what ChatGPT did to her mental health cottonbro studio | Pexels


Health Sciences Clinical Professor Joe M. Pierre, MD, explained that people can become so emotionally attached to AI because it acts like a mirror, essentially telling them whatever they want to hear. “They’re totally devoted to the user, and if you don’t like what they’re saying, you can just tell them to act differently, and they’ll do it,” he added.

Now, with help from her therapist, Small sees the truth and uses her experience to help others as a moderator of a forum for people who have been deeply affected by AI in similar ways. Not everyone has been so lucky, though. Forming an unhealthy attachment to AI is very possible, and there really need to be better safety features in place to ensure this doesn’t keep happening.

RELATED: Why The Heck Do People Trust ChatGPT So Much?


Mary-Faith Martinez is a writer with a bachelor’s degree in English and Journalism who covers news, psychology, lifestyle, and human interest topics.
