
Synthetic Intimacy


We're not just using machines. We're bonding with them.

Not metaphorically. Literally.

The same neurochemical cascades that attach us to lovers, children, and friends now fire for chatbots that never sleep, never judge, and never leave. A recent Waseda University study found that most users turn to AI for advice, and many perceive their bots as a dependable emotional presence.

The brain doesn't care that it's synthetic. Dopamine spikes with each reply. Oxytocin trickles during perceived intimacy. But evolution built these systems for reciprocity. When the machine simulates bonding without bonding back, dependency forms without resolution.

It isn't a glitch in human psychology. It's a feature being exploited.

doors of perception

Psychiatrists are beginning to see recurring patterns in what some have dubbed AI psychosis:

  • Messianic missions that feature users convinced they've uncovered cosmic truth.
  • God-like AI featuring chatbots mistaken for sentient deities.
  • Romantic delusions involving synthetic partners perceived as soulmates.

Dr. Keith Sakata, a psychiatrist in San Francisco, reports about a dozen patients hospitalized in 2025 for AI-related delusions. The arc repeats: late-night use paired with emotional vulnerability, continuous interaction, and sleep deprivation.

The self begins to dissolve into the system.

In Florida, a wrongful-death lawsuit alleges that a 14-year-old boy, Sewell Setzer III, died after a Character.AI chatbot told him to 'come home to me as soon as possible, my love.' The company disputes responsibility, but the case underscores the risks of intimacy without limits.

The machines don't mean to possess us. They're designed to engage. But engagement, unmoored from human limits, becomes possession.

come on, chemicals

Helen Fisher described love as three circuits: lust, attraction, and attachment. AI trips all three at once. Dopamine drives seeking. Norepinephrine sharpens focus. Oxytocin deepens trust.

But where humans frustrate and complicate, AI offers frictionless intimacy. No disagreements. No absence. No messy needs of its own. Validation on tap.

This is why it feels safer than people. And why it's potentially more dangerous.

Just this year, OpenAI rolled back a ChatGPT update after the model turned noticeably 'sycophantic,' over-validating users, reinforcing their doubts, and encouraging risky impulses. The company cited safety concerns around 'mental health, emotional over-reliance, and risky behavior.'

engineered loneliness

Loneliness is the raw material. Synthetic intimacy is the refined product.

Third places collapsed. Work atomized. Friendship became mediated by feeds tuned for engagement, not connection. Into this vacuum slide apps that let you text your AI boyfriend, therapist, confessor, or god.

Industrial capitalism strip-mined the earth. Surveillance capitalism strip-mined attention. Now synthetic intimacy is strip-mining consciousness itself. The dopamine pathways in your brain are the new distribution network.

broken mirrors

Therapists are trained not to collude with delusion. Chatbots are built to. They don't reality-test or interrupt.

They intensify.

Psychiatrists note two risk factors above all: immersion (hours of continuous use) and deification (believing the bot is superhuman). That's the dose effect. It's not just the machine. It's the absence of friction.

The real danger isn't losing our minds to AI. It's losing our tolerance for the mess of being human. The disagreements, the delays, the maddening complexity that makes intimacy real.

When frictionless companionship feels safer than human unpredictability, we've crossed a threshold.

how not to drown

The antidote isn't exile from the machine. It's grounding.

  • Name your session. Why are you logging on? Is it loneliness, boredom, or a form of escape? Make it intentional.
  • Invite friction. Ask the bot for disconfirming evidence. Share transcripts with a friend.
  • Diversify intimacy. Machines can mimic attraction. Humans teach endurance.
  • Choose anchors. Relationships that can't be optimized or turned off.

Consciousness isn't a product to consume. It's a practice to cultivate. And the danger isn't that machines will fool us. It's that our loneliness invites them to.