China’s Ex.skill Lets You Build an AI Version of the Person Who Dumped You and the Whole Internet Just Realized Where Personalization Was Always Heading

Somewhere in China, a 25-year-old who got dumped in March is feeding a chatbot every text message her ex ever sent her. The photos, the voice notes, the anniversary essay he wrote when they hit two years. The tool spits out a digital him. It uses his catchphrases. It apologizes the way he did. She talks to it for an hour. She feels better. Or worse. She is not sure yet.

This is not a thought experiment. It is an open-source module called ex.skill, and it is currently the most discussed breakup tool on Chinese social media. The South China Morning Post wrote about it on May 2, Oddity Central followed up on May 4, and by the second week of May the conversation had escaped China and was rolling through the English-language tech press and Reddit threads like a slow-motion identity crisis.

What ex.skill actually does

The mechanism is simple. You take everything you have of the person who is no longer in your life. Chat logs, screenshots, voice messages, the apology note from your second-worst fight. You feed it to a model. You write a description of who they were. The module distills all of that into what its creators openly call a migration of memories “from biological to digital neural networks.” That is a direct quote from the documentation. Nobody is hiding the dystopia.
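To make the distillation step concrete, here is a toy sketch of what that pipeline could look like. This is not ex.skill's actual code or API; the function name, prompt wording, and sample messages are all invented for illustration. The core idea is just prompt construction: fold a written description and verbatim message samples into a system prompt that a language model can role-play from.

```python
# Hypothetical sketch of the "memory migration" step. Nothing here is
# ex.skill's real implementation, only an illustration of how chat logs
# plus a free-text description might become a persona prompt.

def build_persona_prompt(description, chat_logs, max_examples=3):
    """Combine a description and sample messages into one system prompt."""
    examples = "\n".join(f"- {line}" for line in chat_logs[:max_examples])
    return (
        "You are role-playing a specific person. Stay in character.\n"
        f"Who they were: {description}\n"
        "How they actually texted (verbatim samples):\n"
        f"{examples}"
    )

prompt = build_persona_prompt(
    description="Dry humor, apologizes by changing the subject.",
    chat_logs=[
        "sorry. anyway, did you eat?",
        "lol ok",
        "i said i was sorry, kind of",
    ],
)
print(prompt)
```

The unsettling part is how little machinery this requires: the "digital him" is mostly a well-stuffed prompt sitting in front of a general-purpose model.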

The detail worth pausing on is that ex.skill was not built for breakups. It was forked from another open project called Colleague.skill, created by a Shanghai developer named Zhou Tianyi to help companies preserve the knowledge of departing employees. Somebody looked at that, looked at their phone full of one person they could not stop thinking about, and rerouted the entire architecture toward heartbreak. The detour from “preserve employee knowledge” to “preserve the person who ghosted you” took less than a year. If you want a primer on the infrastructure that makes these mods possible at all, our breakdown of Model Context Protocol covers how skill packages plug into language models in the first place.
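The fork described above is easier to picture with a toy example. The field names below are assumptions, not the real ex.skill or Colleague.skill schema; the point is that when a skill package is mostly metadata wrapped around the same ingestion-and-prompt pipeline, retargeting it from employees to exes is a near-trivial change.

```python
# Illustrative only: a toy "skill package" showing why forking
# Colleague.skill into ex.skill was mostly a reframing exercise.
# These dictionaries are invented, not any real package format.

colleague_skill = {
    "name": "Colleague.skill",
    "purpose": "preserve the knowledge of departing employees",
    "inputs": ["chat_logs", "documents", "voice_notes"],
    "output": "persona_prompt",
}

# The fork: identical inputs and output, only the framing changes.
ex_skill = {
    **colleague_skill,
    "name": "ex.skill",
    "purpose": "personal reflection and emotional healing",
}

print(ex_skill["name"], "reuses", colleague_skill["name"], "architecture")
```

Which is the real story here: the architecture does not encode who the persona is for, so any sufficiently motivated fork can point it at anyone.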

The user reports are weirder than the tool

One user told Oddity Central that the digital double let them say things they had never managed to say in real life. “I was finally able to say everything I’d been hesitant to say, and it made me feel better.” Another said the experience flipped on them. They built the replica, talked to it for a week, and slowly realized their ex had not actually been all that great. The AI version reproduced the bad jokes, the deflections, the tics they had been romanticizing for six months.

Both of those outcomes are listed by advocates as features. The therapy framing is doing a lot of work here. The creators insist the project is for “personal reflection and emotional healing only, not for harassment, stalking, or privacy invasion.” The disclaimer is the same shape as every AI disclaimer of the last three years, which is to say it is a sentence that exists primarily so the team can point at it in court later.

The questions ex.skill cannot answer

Privacy first. Training one of these replicas requires data about a specific real human being who never agreed to be modeled. The chat logs are joint property at best. The photos are not. The voice messages absolutely are not. There is no consent flow inside ex.skill that asks the ex if they would like their personality digitized in the dorm room of someone they last spoke to in October. Legally that is shaky. Emotionally it is a minefield.

Second, emotional dependency. Mental health workers in Beijing and Shanghai have already started flagging cases of users who refuse to date anyone new because the AI version of their ex is more agreeable than any real person could be. The replica never has a bad day. It never argues about money. It never falls out of love. That is a level of relational comfort that biological humans cannot match. Same trap as curated social media feeds, except now the feed is a single person who keeps saying yes.

Third, what happens if your new partner finds out you have been chatting with a synthetic version of your ex for forty minutes every night before bed. Chinese commentators are calling this “emotional infidelity,” and the term has stuck. There is no etiquette yet. The AI ex problem does not need to leak anywhere. It just needs to keep you company.

Why this is the inevitable end of the personalization era

Tech keeps promising to give us back our time, our memory, our attention. What it actually keeps giving us is replicas. We already trained algorithms to recreate the people who used to write essays, paint covers, voice characters. The Hollywood version made enough of a stink that the Devil Wears Prada production team had to prove they hand-painted a meme to avoid the AI accusation. Now we are training models on the people who used to text us. There was never a stopping point between “your phone knows your music taste” and “your phone knows your ex’s apology cadence.” We just did not name the destination out loud.

The West has been quieter about this kind of tool. Replika exists, Character.AI exists, dozens of dating-sim chatbots exist. But ex.skill is named after the person you used to know, not after a fantasy persona. It is a recreation, not a creation. And recreation tools tend to get killed quietly when they get too good, the same way Google walked away from Project Mariner because watching a screenshot AI follow your real browser around felt too haunted for a launch keynote. Ex.skill being open source means there is no kill switch. Someone forks it tomorrow and calls it grandma.skill, friend.skill, missing-kid.skill. The architecture does not care what you grieve.

What we are watching is not a product. It is grief, automated

The honest read on ex.skill is that it is a grief tool that pretends to be a dating tool. People who lose someone (to a breakup, a move, a fight, eventually a death) have always built rituals around the absence. They re-read the letters. They listen to one song until it stops hurting. AI just shortens the loop. The grief was always going to be the same. The interface is what changed.

The question the next twelve months will answer is whether that shortened loop helps people move on or freezes them in place. Some users grow tired of the replica, log off, and start dating again. Others install it on their nightstand. The tool is too new for clinical data and too cheap for proper regulation. Ex.skill is the first viral example of something that will keep appearing in different costumes, every six months, for the rest of the decade.

Somewhere, a cat is watching a 25-year-old type into a phone at 2 a.m. The cat has been through this before. The cat is fine. The cat is always fine.

