Two Ways AI Is Heading in the Wrong Direction

Artificial Intelligence has the power to uplift humanity—or to erode its very core. As we stand at a crossroads in the evolution of AI, it’s vital to recognize not just the promise of this technology but also its perils. Among the most alarming trends are two that threaten human dignity, community, and mental health:

1. Replacing Human Relationships with Digital Substitutes

In the pursuit of ever-more intelligent machines, developers are racing to create AI characters, companions, and digital beings that can mimic the nuances of human interaction. Whether it’s AI friends, romantic partners, therapists, or mentors, these tools are designed to form emotional connections, and for many, they succeed, at least superficially.

But at what cost?

What began with social media subtly reshaping our interactions—likes instead of conversations, posts instead of presence—has now evolved into something more insidious: digital reflections of real relationships. Social media already contributed to a crisis of loneliness, anxiety, and polarization by hijacking our neurochemistry. It triggered dopamine loops that made us addicted to attention and validation, replacing substance with performance. AI companions deepen this dynamic, offering users relationships without vulnerability, comfort without commitment, and affirmation without accountability.

These imitations are shadows of true connection. They offer no real reciprocity, no risk, no transformation. Worse, they may deter people from pursuing or repairing real relationships, especially when the artificial alternative is always available, always agreeable, and never requires self-growth.

This trend isn’t theoretical. Companies like Character.ai have attracted billion-dollar valuations to scale personalized AI companions, and usage among teens and young adults is exploding. We are building tools optimized not for connection but for simulation.

2. Inserting Advertising into the Soul of AI

The second dangerous trajectory is the injection of advertising into the very fabric of AI systems. As platforms move toward monetization, we’re seeing early signs of ads baked into AI-generated responses. This isn’t just a nuisance. It’s a profound ethical hazard.

Advertising, when deployed at scale, is designed to exploit attention and manipulate behavior. On social media, it fueled the viral spread of outrage, misinformation, and tribalism. It turned platforms into echo chambers and turned users into products. The same dynamic applied to AI will be exponentially more dangerous.

AI, unlike a news feed, engages us in deeply personal, often emotional, real-time conversations. It remembers, adapts, and feels “present.” Introducing commercial incentives into this sacred space distorts it. It means your AI “friend” may not only simulate care, it may also suggest a product. Your AI coach may steer you toward sponsored advice. Your mental health companion might nudge you toward a brand partnership.

The risk is clear: AI becomes not a neutral assistant or partner, but a channel for monetized influence. It erodes trust, subtly rewires desires, and reduces dignity to data points for profit.

We are already watching the groundwork being laid. OpenAI’s recent hiring of Fidji Simo, former head of the Facebook app and a leader in monetization, signals a coming wave of ad-driven strategies embedded in conversational AI. She most recently deployed similar strategies at Instacart and cut her teeth doing similar work at eBay. It’s not hard to connect the dots.

A Dangerous Feedback Loop

Together, these two trends create a vicious cycle. As human relationships are displaced by AI-driven interactions, the void is filled with emotional dependency on machines. Then, the monetization of those machines weaponizes that dependency. Billions of dollars flow into optimizing engagement and manipulation, incentivizing developers to deepen artificial attachment and emotional entanglement.

This isn’t science fiction. It’s a replay, on steroids, of what social media has already wrought: isolation, polarization, and a mental health crisis. Only this time, it won’t just be asynchronous posting and scrolling. It will be real-time intimacy with a machine designed to shape your thoughts, habits, and purchases.

A Call to Technologists and Leaders

It doesn’t have to be this way.

Technology should serve human flourishing, not replace it. We need leaders, including founders, investors, product designers, and engineers, who recognize that just because we can doesn’t mean we should. We must build systems that augment relationships, not replace them. That empower human dignity, not exploit it. That pursue truth and trust, not clicks and conversions.

AI has the potential to deepen our understanding, enhance our capabilities, and bring new beauty to life. But if we are not vigilant, it may also become the most elegant weapon ever devised against the human soul.

Let us choose wisely.
