I. The Founder
Sol Kennedy used to ask his assistant to read the messages his ex-wife sent him. After the couple separated in 2020, Kennedy says, he found their communications "tough." An email, or a stream of them, would arrive (stuff about their two kids mixed with unrelated emotional wallops), and his day would be ruined trying to reply. Kennedy, a serial tech founder and investor in Silicon Valley, was in therapy at the time. But outside weekly sessions, he felt the need for real-time support.
After the couple's divorce, their communications shifted to a platform called OurFamilyWizard, used by hundreds of thousands of parents in the United States and abroad to exchange messages, share calendars, track expenses. (OFW keeps a time-stamped, court-admissible record of everything.) Kennedy paid extra for an add-on called ToneMeter, which OFW touted at the time as "emotional spellcheck." As you drafted a message, its software would conduct a basic sentiment analysis, flagging language that could be "concerning," "aggressive," "upsetting," "demeaning," and so on. But there was a problem, Kennedy says: His co-parent didn't seem to be using her ToneMeter.
Kennedy, ever the early adopter, had been experimenting with ChatGPT to "cocreate" bedtime stories with his kids. Now he turned to it for advice on communications with his ex. He was wowed, and he wasn't the first. Across Reddit and other internet forums, people with difficult exes, family members, and coworkers were posting with shock about the seemingly excellent guidance, and the precious emotional validation, a chatbot could provide. Here was a machine that could tell you, with no apparent agenda, that you were not the crazy one. Here was a counselor that would patiently hold your hand, 24 hours a day, as you waded through any amount of bullshit. "A scalable solution" to supplement therapy, as Kennedy puts it. Finally.
But fresh out of the box, ChatGPT was too talkative for Kennedy's needs, he says, and much too apologetic. He would feed it tough messages, and it would recommend replying (in many more sentences than necessary) "I'm sorry, please forgive me, I'll do better." Having no self, it had no self-esteem.
Kennedy wanted a chatbot with "spine," and he thought that if he built it, a lot of other co-parents might want it too. As he saw it, AI could help them at each stage of their communications: It could filter emotionally triggering language out of incoming messages and summarize just the facts. It could suggest appropriate responses. It could coach users toward "a better way," Kennedy says. So he founded a company and started developing an app. He called it BestInterest, after the standard that courts often use for custody decisions: the "best interest" of the child or children. He would take those off-the-shelf OpenAI models and give them spine with his own prompts.
Estranged partners end up fighting horribly for any number of reasons, of course. For many, perhaps even most, things cool down after enough months have gone by, and a tool like BestInterest might not be useful long-term. But when a certain kind of personality is in the mix (call it "high-conflict," "narcissistic," "controlling," "toxic," whatever synonym for "crazy-making" you tend to see cross your internet feed), the fighting about the kids, at least from one side, never stops. Kennedy wanted his chatbot to stand up to these people, so he turned to the one they may hate most: Ramani Durvasula, a Los Angeles–based clinical psychologist who specializes in how narcissism shapes relationships.