ChatGPT Dating Advice Is Feeding Delusions and Causing Unnecessary Breakups

Earlier this year, I wrote an article about a man whose girlfriend uses ChatGPT for relationship advice. He had written to me expressing both intrigue and discomfort with the situation.
“My girlfriend keeps using ChatGPT for therapy and asks it for relationship advice,” he told me. “She brings up things that ChatGPT told her in arguments later on.”
At the time, I was a bit floored by this confession. I had no idea people were actually turning to AI for advice, much less input on their relationships.
However, the more I explored the topic, the more I realized how common it was to seek help from AI—especially in an era where therapy is an expensive luxury.
But can AI really give objective, applicable dating advice, or are you just shooting yourself in the foot by relying on a robot for its perspective?
Is ChatGPT a ‘Yes Man’?
A friend of mine mentioned that she'll occasionally type her dating woes into ChatGPT for quick insight. She originally thought of it as a way to bounce ideas off an unbiased source. After some time seeking input from the generative AI platform, however, she noticed that ChatGPT seemed to heavily validate her experience, perhaps dangerously so.
That raises the question: Is AI actually objective, or is it more of a "yes man"?
About a month ago, someone posed a similar question on Reddit, asking whether the AI tool is feeding our delusions. In the post, the user specifically brought up an "AI influencer" who seemed to be receiving extreme validation and praise from ChatGPT, stating that it "blows so much hot air into her ego" and "confirms her sense of persecution by OpenAI."
“It looks a little like someone having a manic delusional episode and ChatGPT feeding said delusion,” the poster wrote. “This makes me wonder if ChatGPT, in its current form, is dangerous for people suffering from delusions or having psychotic episodes.”
If this is true, it won't just harm the mental health of the individuals using AI this way—it can also damage our relationships.
I mean, imagine going to ChatGPT for dating advice and constantly hearing that you’re in the right while your partner is wrong. We already have enough toxic, selfish individuals and “narcissists” in the dating world. We don’t need ChatGPT validating these people even further.
Can AI Dating Advice Cause Breakups?
Now, let’s get one thing straight: AI cannot directly cause a breakup. Ultimately, you make the decision yourself. If you’re unhappy, unfulfilled, or being treated poorly, you’ll likely know in your heart that it’s time to leave.
But if you’re relying on ChatGPT to make that decision for you, well, you might be doing yourself, your relationship, and your partner a disservice.
Think about it like this: when you speak to a therapist or even a trusted friend about your relationship, you’ll get a more emotional and, well, human response. Someone who has your best interests at heart will consider both your perspective and your partner’s while factoring in your unique experiences and personal struggles.
For example, I often and openly write about my struggles with obsessive-compulsive disorder (OCD). If I went to ChatGPT for dating advice and failed to mention how my OCD tends to attack my relationships, I might receive unhelpful, even harmful, input about my relationship.
On the ROCD subreddit (a community for people who suffer from relationship OCD, the type that fixates on one's relationship), someone even shared that ChatGPT told them to break up with their partner.
In fact, the Reddit account for NOCD, an OCD treatment and therapy service, responded with a solid explanation of this: “It may feel like ChatGPT has all the answers, but understand that engineers work hard to make the program sound authoritative and all-knowing when, in reality, that comes with a lot of caveats,” they wrote. “AI LLMs are not the most trustworthy programs. While they can give eloquent-sounding answers, the programs often ‘hallucinate,’ give inaccurate information, and cite unrelated studies.”
What’s more, someone who tends to be a little more selfish in their dating life might only provide their own biased side of the story and receive even more self-validation, believing their needs matter more than their partner’s.
One person on the Reddit thread mentioned earlier said that ChatGPT “consigns my BS regularly instead of offering needed insight and confrontation to incite growth.”
That’s a serious issue.
When in doubt, don’t use ChatGPT for dating advice. And if you decide to ask it anyway, at least take its input with a grain of salt.
The post ChatGPT Dating Advice Is Feeding Delusions and Causing Unnecessary Breakups appeared first on VICE.