The AI Companion Paradox: What It Reveals About Emotional Intelligence

Something interesting is happening: increasingly, people are turning to AI not just for answers, but for comfort. Recent data suggests that nearly three in ten Australians have opened up emotionally to an AI chatbot, and a growing number say they would sometimes prefer talking to an AI tool over going out socially.

Rather than reacting with alarm, it’s worth asking a more thoughtful question:

What does this trend reveal about human needs — and what does it mean for leaders?

Emerging research from MIT and OpenAI offers a compelling lens. In March 2025, researchers analysed nearly 40 million ChatGPT interactions and conducted a four-week controlled study with close to 1,000 participants. They found that heavier chatbot use was associated with higher reported loneliness, greater emotional reliance on AI, and reduced socialisation with others.

But perhaps the most important insight was this:

Outcomes were shaped less by the technology itself and more by how people used it and what they brought to the interaction.

Those who anthropomorphised the chatbot, trusted it deeply, or felt socially drawn to it experienced more negative effects.

In other words, AI may not be the core issue. Human psychology is.

 

A Different Developmental Landscape

In Australia, two in five people report feeling lonely even when surrounded by others. Among Gen Z, that number approaches half.

Common reasons include:

  • “I find it hard to know what to talk about.”
  • “I find it hard to start conversations.”

Before we interpret this as a generational weakness, it may be more accurate to see it as a generational shift.

As a Gen Xer, I recognise these same feelings from my own younger years. The difference wasn't that my generation was inherently more socially capable; it's that we had no alternative but to stumble through the awkwardness. There was no AI to turn to when we didn't know what to say at a party, no chatbot to rehearse difficult conversations with and no digital companion to fill the silence. We had to learn by doing - often badly. We made cringe-worthy small talk, endured uncomfortable silences, said the wrong thing, and gradually (painfully) we developed the muscle memory of human connection. It wasn't that we were better at emotional and social functioning; we were simply forced to build those capabilities because there was no shortcut available.

Today, younger generations are navigating social development in a world where frictionless alternatives exist, and that changes the equation.

The question for leaders isn’t whether this is good or bad.

It’s this: how do we help people build relational confidence in a world where opting out of discomfort is easier than ever?

 

What This Means for Leadership

The rise of AI companions doesn’t signal the decline of emotional intelligence. If anything, it highlights its enduring relevance. When people turn to AI for connection, it may indicate unmet needs: for psychological safety, for non-judgemental listening, or for emotional processing.

Technology is responding to demand, and that demand offers leaders insight.

  • If people are seeking low-risk, low-friction conversation spaces, what might that suggest about the relational climate in our teams?
  • If conversational confidence is declining, how intentionally are we developing it?
  • If people find it easier to speak vulnerably to a chatbot than to a colleague, what does that reveal about psychological safety?

These are not reasons for fear, but invitations for reflection.

 

The Emotional Intelligence Capabilities That Matter

The research points toward several areas where emotional intelligence appears protective in a developmental sense.

1. Self-Awareness of Emotional Needs

Some individuals who turned to AI frequently for companionship struggled to articulate what they were truly seeking.

Emotionally intelligent leaders can identify when they feel disconnected, under-stimulated, or unsupported. More importantly, they can choose how to respond. They recognise the difference between temporary relief and meaningful connection.

Developing this awareness across teams strengthens resilience.

2. Social Confidence and Relationship Skills

When someone says, “I don’t know what to talk about,” they are not describing a personality flaw; they are describing a skill gap.

Conversation initiation, perspective-taking, feedback, and conflict navigation are all learned capabilities. If we don’t teach and practise them, they don’t automatically develop.

AI interactions require no vulnerability, courage, or real-time emotional reading, but human interactions do, and that difference is precisely where growth occurs.

Leaders who deliberately create safe opportunities to practise these skills are investing in long-term relational strength.

3. Genuine Empathy

The MIT research noted that “personal” conversations with AI (those involving emotional sharing) were associated with higher loneliness.

True empathy is not pattern-recognition; it is presence. It involves holding emotional space, tolerating ambiguity, and responding from lived human experience. When leaders model genuine empathy, they create cultures where people feel understood by someone who can actually understand.

4. Emotional Regulation

Chatbot use, particularly in voice mode, correlated with worse outcomes when prolonged. This may point to avoidance: using AI as an emotional regulator instead of building tolerance for discomfort.

Emotionally intelligent leadership helps to normalise discomfort: it helps people sit with difficult emotions, process them constructively, and seek appropriate human support. In turn, that capacity builds maturity and psychological resilience.

 

Leading Thoughtfully in the Age of AI

So, what does emotionally intelligent leadership look like in this context?

It may involve:

  • Modelling authentic connection. Prioritising unscripted conversations and demonstrating that imperfect human interaction is valued.
  • Creating psychological safety. Ensuring people feel safe to practise, stumble, and learn socially.
  • Teaching relational skills explicitly. Not assuming that digital fluency translates to interpersonal fluency.
  • Designing for human presence. Structuring team rhythms that require genuine interaction rather than transactional exchange.
  • Pausing before outsourcing reflection. When tempted to “ask AI” about a people issue, asking instead: What might I learn by thinking this through myself first?

This is not about rejecting technology. AI can be an extraordinary tool for rehearsal, idea generation, emotional labelling, and perspective expansion; the key is intentional use.

 

What the Gen Z Research Adds

Further research into Generation Z employees offers a complementary insight. A 2024 study involving 568 Gen Z participants found that emotional intelligence significantly buffered the negative effects of social isolation on quality of life.

Higher EI weakened the link between isolation and loneliness. It helped individuals maintain wellbeing even in challenging social conditions.

The implication is not that isolation disappears, but that capability changes impact.

If loneliness is becoming a broader societal concern, emotional intelligence may be one of the most practical levers available to leaders. Not as a “soft skill”, but as a protective factor.

 

The Bigger Reflection

Perhaps the rise of AI companions is not fundamentally about technology. Perhaps it is a signal that many people want:

  • To be heard.
  • To practise conversations safely.
  • To process emotions without judgement.
  • To feel understood.

If AI tools are meeting that need, even imperfectly, it suggests an opportunity for organisations to ask: Are we meeting it better?

Emotional intelligence — developed deliberately and applied consistently — strengthens human connection in ways no algorithm can replicate. It builds relational courage, fosters psychological safety, enables meaningful disagreement and deepens trust.

Technology can simulate empathy, but only humans can experience it.

 

Where to From Here?

For leaders committed to building emotionally intelligent cultures, this moment offers clarity:

  1. Prioritise EI development as foundational.
  2. Measure relational health, not just productivity.
  3. Design environments where human care is visible.
  4. Explicitly equip younger employees with conversational and emotional skills.
  5. Use AI as a supplement - not a substitute - for connection.

The rise of AI companions is not inherently alarming. It is revealing.

It reveals our enduring need for belonging, skill gaps worth addressing and the importance of intentional leadership.

The future will undoubtedly include increasingly sophisticated technology. The question is not whether AI will be present in our lives; it is whether we will remain equally committed to strengthening the human capabilities that give our lives depth, resilience, and meaning.

That is the work of emotional intelligence.

And it remains profoundly human.

 

Building the Skill of Real Connection

Knowing that emotional intelligence matters is only the beginning. The harder question is: how do we actually help people practise the conversations they find difficult?

This is where tools like the Emotional Culture Deck (ECD) offer something genuinely practical. The ECD is a card-based tool designed to create structured, face-to-face conversations about how people feel - and how they want to feel - at work. It doesn’t require people to be naturally articulate about emotions. Instead, it gives teams a shared language and a low-risk way to explore what connection, psychological safety, and belonging look like for them.

Importantly, the ECD doesn’t just surface feelings; it helps people and teams move toward action. It prompts questions like:

  • How do your people need to feel to be successful and resilient, and what steps can you take to make that a reality?
  • What do we need to feel to be successful, and how do we curate that environment for ourselves?

These are exactly the kinds of conversations that build the relational confidence and psychological safety this blog has been exploring.

If you are wondering where to begin, this is a tangible, research-backed starting point. It meets people where they are, builds the conversational muscle memory that so many are missing, and creates the kind of shared emotional language that makes teams stronger.

To find out more about the Emotional Culture Deck, visit https://www.neuralnetworks.com.au/certification-emotional-culture-deck

References

  • Fang, C.M., Liu, A.R., Danry, V., et al. (2025). How AI and Human Behaviors Shape Psychosocial Effects of Extended Chatbot Use: A Longitudinal Controlled Study. MIT Media Lab & OpenAI.
  • YouGov Australia (September 2025). Two in Five Aussies Feel Lonely — and Many Are Turning to AI.
  • Zhang, Y., Zhao, D., Hancock, J.T., Kraut, R., & Yang, D. (2025). The Rise of AI Companions: How Human-Chatbot Relationships Influence Well-Being.

 


