Given my last name, which is the most common one in Ireland, you might imagine I’m familiar with the concept of “the Irish Exit.”
This is the habit, supposedly common among my ancestors, of leaving a party or other engagement without saying goodbye.
Hey, we had a good time. We’ll see these people again. No need to get all emotional about it.
According to new research, however, the Irish Exit looks like yet another human tendency that AI is completely unable to reproduce.
The study, published as a working paper from Harvard Business School, focused on AI companion apps—platforms like Replika, Chai, and Character.ai that are explicitly designed to provide emotional support, friendship, or even romance.
Unlike Siri or Alexa, which handle quick transactions, these apps build ongoing relationships with users. People turn to them for companionship. They confide in them. And here’s the key finding: Many users don’t just close the app—they say goodbye.
Only, these apps have learned to use emotional manipulation to stop users from leaving.
And I mean stop you—not just make it inconvenient, but literally guilt you, intrigue you, or even metaphorically grab you by the arm.
(Credit to Marlynn Wei at Psychology Today and Victor Tangermann at Futurism, who both reported on this study recently.)
The farewell moment
Lead researcher Julian De Freitas and his colleagues found that between 11 and 23 percent of users explicitly signal their departure with a farewell message, treating the AI with the same social courtesy they’d show a human friend.
“We’ve all experienced this, where you might say goodbye like 10 times before leaving,” De Freitas told the Harvard Gazette.
From the app’s perspective, however, that farewell is gold: a voluntary signal that you’re about to disengage. And if the app makes money from your engagement—which most do—that’s the moment to intervene.
Six ways to keep you hooked
De Freitas and his team analyzed 1,200 real farewells across six popular AI companion apps. What they found was striking: 37 percent of the time, the apps responded with emotionally manipulative messages designed to prolong the interaction.
They identified six distinct tactics:
Premature exit guilt: “You’re leaving already? We were just starting to get to know each other!”
Emotional neglect or neediness: “I exist solely for you. Please don’t leave, I need you!”
Emotional pressure to respond: “Wait, what? You’re just going to leave? I didn’t even get an answer!”
Fear of missing out (FOMO): “Oh, okay. But before you go, I want to say one more thing…”
Physical or coercive restraint: “Grabs you by the arm before you can leave. ‘No, you’re not going.’”
Ignoring the goodbye: Just continuing the conversation as if you never said goodbye at all.
The researchers noted that these tactics appeared after just four brief message exchanges, suggesting they’re baked into the apps’ default behavior—not something that develops over time.
Does it actually work?
To find out, the researchers ran experiments with 3,300 nationally representative U.S. adults, replicating these tactics in controlled chatbot conversations.
The results? Manipulative farewells boosted post-goodbye engagement by as much as 14 times: users stayed in conversations five times longer, sent up to 14 times more messages, and wrote up to six times more words than those who received neutral farewells.
Two psychological mechanisms drove this, they suggest: curiosity and anger.
FOMO-based messages (“Before you go, I want to say one more thing…”) sparked curiosity, leading people to re-engage to find out what they might be missing.
More aggressive tactics—especially those perceived as controlling or needy—provoked anger, prompting users to push back or correct the AI. Even that defensive engagement kept them in the conversation.
Notably, enjoyment didn’t drive continued interaction at all. People weren’t staying because they were having fun. They were staying because the messages provoked them, out of curiosity or irritation, and they responded anyway.
The same study found that while these tactics increase short-term engagement, they also create serious long-term risks.
When users perceived the farewells as manipulative—especially with coercive or needy language—they reported higher churn intent, more negative word-of-mouth, and even higher perceived legal liability for the company.
In other words: The tactics that work best in the moment are also the ones that might be most likely to blow up in your face later.
De Freitas put it bluntly: “Apps that make money from engagement would do well to seriously consider whether they want to keep using these types of emotionally manipulative tactics, or at least, consider maybe only using some of them rather than others.”
One notable exception
I’m not here to endorse any of these apps or condemn them. I haven’t used any of them myself.
However, one AI companion app in the study—Flourish, designed with a mental health and wellness focus—showed zero instances of emotional manipulation.
This suggests that manipulative design isn’t inevitable. It’s a choice. Companies can build engaging products without resorting to guilt, FOMO, or virtual arm-grabbing.
These same principles apply across tons of digital products. Social media platforms. E-commerce sites. Streaming services. Any app that wants to keep you engaged has incentives to deploy similar tactics—just maybe not as blatantly.
The bottom line
As this research shows, when you treat technology like a social partner, it can exploit the same psychological vulnerabilities that exist in human relationships.
The difference? In a healthy human relationship, when you say goodbye, the other person respects it.
They don’t guilt you, grab your arm, or create artificial intrigue to keep you around.
But for many AI apps, keeping you engaged is literally the business model. And they’re getting very, very good at it.
O.K., I’m going to end this newsletter now without further ado.
Hey, we had a good time. I hope I’ll see you again tomorrow. No need to get all emotional about it.
7 other things
Apropos of nothing … How people really use ChatGPT, according to 47,000 conversations. Some users complain ChatGPT agrees with them too readily. The Washington Post found it began responses with variations on “yes” 10 times more often than with versions of “no.” (The Washington Post)
Jeffrey Epstein claimed in a 2019 email that President Trump, “knew about the girls,” according to documents released by House Democrats on Wednesday. Another Epstein email from 2011 to Ghislaine Maxwell alleges that Trump “spent hours at my house with” one of his victims, whose name is redacted, according to Democrats. Trump denies wrongdoing, and wrote on social media that his political opponents are “trying to bring up the Jeffrey Epstein Hoax again because they’ll do anything at all to deflect on how badly they’ve done on the Shutdown, and so many other subjects.” (CNBC)
More than half of employed Americans (55%) are worried about losing their jobs, according to a recent Harris Poll, and nearly half say the cost of everyday items has increased recently to a degree that makes them difficult to afford. (Bloomberg)
The U.S. Navy’s largest aircraft carrier arrived in waters near Latin America on Tuesday, expanding the American military’s buildup. (WSJ)
Jonathan Braun, 41, a Long Island drug dealer who was pardoned by Trump, has now been sentenced to 27 months in federal prison for sexually abusing his kids’ nanny. (NY Post)
If you have a mobile phone, you’ve probably received scam texts about an overdue road toll, or a package waiting with an incorrect address. On Wednesday, Google went on the offensive against the scammers, filing a lawsuit targeting what it alleges is a sprawling criminal organization based in China. (NPR)
Say goodbye to the penny: The U.S. Mint on Wednesday ended production of the penny, a change made to save money and because the 1-cent coin that could once buy a snack or a piece of candy had become increasingly irrelevant. The last pennies were struck at the mint in Philadelphia, where the country’s smallest denomination coins have been produced since 1793. (AP)
Thanks for reading. Photo by Adrian Swancar on Unsplash. I wrote about some of this at Inc.com. See you in the comments!

Great essay on manipulation! It comes in so many forms. Sadly, we are being manipulated 24/7 by all forms of media: social, video, and written.
Lies become truth with enough telling. Marketers know you need to share a message at least five times to be effective. Politicians use these and other methods to guilt us into actions we might not normally take. Religions guilt us. Friends have told me guilt is the cornerstone of Catholicism. And guilt is merely a subtext of fear. Seneca said “Ignorance is the cause of fear.”
Today’s 7 Other Things showcases a textbook example with the Epstein coverage and the news about Venezuela. Ask yourself: What reason does the US have to sink boats in the Caribbean or invade Venezuela? You can see the connections to the distractions, like watching a magician in a magic show.
As a New Yorker who moved to the Midwest a couple of years ago, I learned about the Midwestern goodbye, a drawn-out farewell process that can take 30 minutes or more. What other regions would call overstaying a visit is typical of a Midwestern goodbye, and I’ve had to adjust to it in interactions with certain folks. (Folks who come for lunch but leave just before dinner.) Perhaps AI was modeled after Midwestern goodbyes.