AI Supremacy


The Loneliness Economy of AI

How AI Companies Discovered Our Emotional Hunger Was More Profitable Than Our Productivity

Michael Spencer and Wyndo ∙ May 20, 2025 ∙ Paid


Welcome Back!

In 2025, young people all over the world are becoming particularly vulnerable to using AI for therapy. With technological loneliness already a huge problem even before generative AI, where will this latest trend lead? We’re raising a generation of ChatGPT and Character.AI addicts.

  • As of May 2025, over one-third of people aged 18 to 24 in the US use ChatGPT.

I asked Wyndo, the viral writer behind The AI Maker, to take a deeper look into this for us.

Introducing The AI Maker

The AI Maker
Making AI accessible for everyday life. Simple strategies to build smarter, work faster, and live better.
By Wyndo

⊞ A Window into AI ☀️

Wyndo has so many incredible quotes about how he’s leveling up and using AI.

  • Forget Prompting Techniques: How to Make AI Your Thinking Partner

  • How I Learned Complex Topics 10x Faster with NotebookLM

  • My AI Therapy Workflow: Turn Claude/ChatGPT and NotebookLM Into Your Self-Discovery Tool

Become an AI Maker

Does AI have all the answers?

According to a Google AI Overview:

“AI is emerging as a tool to augment mental health support, offering accessible and personalized assistance for various conditions. While AI cannot replace human therapists, it can be a valuable supplement for managing mild mental health concerns, reinforcing skills learned in traditional therapy, and providing ongoing support. AI therapy offers features like personalized interventions, early symptom detection, and virtual therapy platforms, leveraging its capacity to analyze data and provide insights.”

By Wyndo (MB), May 2025.

When the movie "Her" debuted in 2013, we watched Theodore fall in love with his AI assistant Samantha and thought: "Interesting science fiction, but that's not how technology works." We imagined AI would revolutionize productivity, not intimacy.

Yet at 2 AM, I stared at my screen, unsettled by what I was reading.

"Based on your journal entries," Claude told me, "you consistently devalue your accomplishments within hours of achieving them. Your anxiety isn't about failing—it's about succeeding and still feeling empty."

No human—not friends, family, or therapists—had ever pointed this out. Yet here was an algorithm, pattern-matching three years of my journal entries, revealing a core truth about myself I'd never recognized.

This wasn't supposed to happen. AI was built to draft emails and generate code—not uncover our psychological blind spots. Yet in 2025, Harvard Business Review has confirmed what many suspected: therapy and companionship have become gen AI's dominant use case, surpassing professional applications like writing emails, creating marketing campaigns, or coding.

We're witnessing a profound psychological shift: millions are now finding deeper emotional understanding from algorithms than from the humans in their lives—and this reveals as much about our broken human connections as it does about our advancing technology.

The question isn't whether machines can understand us. It's why we're increasingly turning to them instead of each other. And the answer points to an uncomfortable truth: in a world of unprecedented connectivity, authentic understanding has become our scarcest resource.

The Data Behind Our Digital Intimacy Shift

When Google search trends for "AI girlfriend" surge 2400% in two years, we're not witnessing mere technological curiosity. We're seeing millions voting with their attention for algorithmic connection over human relationships that have somehow failed them.

The usage patterns tell the story: on Character AI, a platform where users create and interact with AI personalities, sessions stretch to 45 minutes, mirroring the length of therapy appointments, compared to ChatGPT's 7-minute average. This stark difference convinced me that people no longer use AI for functionality, but for emotional investment. For context, human conversations on social media typically last under 10 minutes.

The demographics are even more striking. Character AI's 233 million users skew young (57% are between 18 and 24), with studies showing 80% of Gen Z would consider marrying an AI. This generation—raised with unprecedented digital connection yet reporting record loneliness—is pioneering a new type of relationship.

Conversation patterns reveal our hunger for understanding. Discussions about loneliness run longer, involve more exchanges, and contain more words than other interactions. Users aren't seeking information—they're seeking the sense that someone is truly listening.

As Mark Zuckerberg recently revealed on Dwarkesh’s podcast:

"The average American has fewer than three friends but desires meaningfully more."

This gap between our social hunger and reality has created the perfect vacuum for AI companions to fill.

  • What follows is also a unique guide to taking advantage of AI therapy.

  • What AI with memory will mean and how to guard against the exploits.

  • Best practices for using AI for well-being, personal work-life balance and emotional self-regulation.

The Human Connection Failures AI Exploits

Keep reading with a 7-day free trial

A guest post by Wyndo, sharing an optimistic view of how to build smarter, work faster, and live better with AI. Building in Public || Vibe-coder.
© 2025 Michael Spencer