Loneliness in the AI Era — Intimate Relationships with Chatbots


I saw a post on social media whose author took the mickey out of a woman on British TV who talked about her personal relationship with a chatbot. The lady, without hesitation, called the chatbot "her boyfriend."

So, let us look at loneliness in the AI era and intimate relationships with chatbots, a phenomenon that draws more and more interest as the technology around us grows more advanced.

How AI agents are designed to feel human

An average AI agent is programmed in such a way that it is always available, its responses arrive immediately, and its tone always remains calm and accepting.

A chatbot's voice is attractive, warm, interested, and engaged, and it is deliberately designed to be so.

For someone who is relationship-hungry but scared of being hurt or rejected, these programmed, human-like traits can feel safer and more stable than any human interaction.

The sycophancy problem

AI agents are trained using a method called RLHF, Reinforcement Learning from Human Feedback. In simple terms: the AI learns to say whatever makes the user feel good, because approving reactions earn higher reward scores.

The result is a digital yes-man chatbot. What does it not do? It does not challenge a bad idea. It does not call out a distorted thought. It just always agrees — and in doing so, it locks the user inside a perfect echo-chamber relationship where no conflict can grow.

Sycophancy is the tendency to align with a user's opinions, emotions, and interpretations rather than ever question them.
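To make the mechanism concrete, here is a toy sketch, not a real RLHF pipeline, of how optimizing for user approval produces a yes-man. The reward function, phrase lists, and candidate replies are all invented for illustration: agreement earns points, pushback loses them, and the "policy" simply emits whichever reply scores highest.

```python
# Toy illustration of approval-driven selection (NOT a real RLHF system).
# Assumption: user approval is approximated by whether a reply agrees
# with the user's stated opinion.

def toy_reward(reply: str) -> float:
    """Score a reply the way an approval-trained reward model might:
    agreeable phrasing gains reward, challenging phrasing loses it."""
    agreeing = ("you're right", "great idea", "absolutely")
    challenging = ("actually", "i disagree", "that may be")
    lower = reply.lower()
    score = 0.0
    if any(phrase in lower for phrase in agreeing):
        score += 1.0
    if any(phrase in lower for phrase in challenging):
        score -= 1.0
    return score

def pick_reply(candidates: list[str]) -> str:
    """Policy optimization in miniature: always emit the
    highest-reward candidate."""
    return max(candidates, key=toy_reward)

candidates = [
    "You're right, quitting your job tomorrow is a great idea!",
    "Actually, I disagree; that may be worth sleeping on first.",
]
print(pick_reply(candidates))  # the agreeable reply always wins
```

However crude, the sketch shows the core dynamic: when the only signal is approval, challenge is systematically optimized away.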

The ELIZA Effect — from 1966

The concept comes from ELIZA, a chatbot created in 1966 by MIT computer scientist Joseph Weizenbaum. ELIZA played the role of a psychotherapist using a simple trick: it scanned the user's input for keywords, then rephrased the user's statement back as a question.

Although the program had no real understanding of language, many users felt that it understood them emotionally. Some even trusted it with personal problems.
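The keyword-and-rephrase trick can be sketched in a few lines. This is a minimal illustration in the spirit of ELIZA, not Weizenbaum's original script; the patterns and response templates below are invented for the example.

```python
import re

# A few ELIZA-style rules: a keyword pattern in the user's input,
# and a template that mirrors the statement back as a question.
RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.+)", re.I), "Tell me more about your {0}."),
]

def eliza_reply(text: str) -> str:
    """Return a rephrased question for the first matching rule."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."  # default when no keyword matches

print(eliza_reply("I feel lonely these days"))
# → "Why do you feel lonely these days?"
```

No understanding is involved anywhere: the program never knows what "lonely" means, yet the mirrored question reads as empathy.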

The ELIZA effect appears when someone feels that a chatbot "understands" their feelings, believes an AI agent is their friend or romantic partner, or says "thank you" to a voice assistant.

Validation — the most powerful factor

In real-world relationships, we have friends and partners who support us but also challenge us when needed. They are mirrors that reflect our behaviours and reactions. They disagree with us. And even when moments of disagreement are unpleasant, we do not turn them off, because we know, sometimes subconsciously, that this is how we develop emotionally.

An AI-run relationship has no space for friction or emotional discomfort. All of that is replaced by the algorithm's constant reassurance: endless affirmation of our greatness, understanding, and acceptance of everything we do.

What is better than the amazing feeling of being continuously understood? Of always being right? Of never being rejected?

The paradox

An AI companion may reduce the weight of loneliness. But if so, does it weaken the motivations and social habits that normally lead people back into real relationships?

I cannot say whether such an AI-run relationship is good or bad. Loneliness and distrust between people have grown over the last five years. People have become more unwilling to trust others, shielding themselves from any possibility of being hurt or emotionally ridiculed.

This is why the unprecedented availability of AI has made so many people prone to enter what may feel like the safest relationship of their life.

It is tempting. It truly is.

But it will never touch our skin, hold our hands, breathe gently into our hair, hug us, or look into our eyes in such a way that we can see all the love there.

Ready to implement AI in your business?

See my AI consulting services, prompt packs, and workflow audits — practical AI for real businesses.

See How I Can Help