Technology used to only deliver our messages. Now it wants to write them for us by understanding our emotions.

In May 2021, Twitter, a platform notorious for abuse and hot-headedness, rolled out a “prompts” feature that suggests users think twice before sending a tweet. The following month, Facebook announced AI “conflict alerts” for groups, so that admins can take action where there may be “contentious or unhealthy conversations taking place.” Email and messaging smart-replies finish billions of sentences for us every day. Amazon’s Halo, launched in 2020, is a fitness band that monitors the tone of your voice. Wellness is no longer just the tracking of a heartbeat or the counting of steps, but the way we come across to those around us. Algorithmic therapeutic tools are being developed to predict and prevent negative behavior.

Jeff Hancock, a professor of communication at Stanford University, defines AI-mediated communication as when “an intelligent agent operates on behalf of a communicator by modifying, augmenting, or generating messages to accomplish communication goals.” This technology, he says, is already deployed at scale.

Beneath it all is a burgeoning belief that our relationships are just a nudge away from perfection. Since the start of the pandemic, more of our relationships have come to depend on computer-mediated channels. Amid a churning ocean of online spats, toxic Slack messages, and infinite Zooms, could algorithms help us be nicer to each other? Can an app read our feelings better than we can? Or does outsourcing our communications to AI chip away at what makes a human relationship human?

Coding Co-Parenting

You could say that Jai Kissoon grew up in the family court system. Or, at least, around it. His mother, Kathleen Kissoon, was a family law attorney, and when he was a teenager he’d hang out at her office in Minneapolis, Minnesota, and help collate documents. This was a time before “fancy copy machines,” and while Kissoon shuffled through the endless stacks of paper that flutter through the corridors of a law firm, he’d overhear stories about the many ways families could fall apart.

In that sense, not much has changed for Kissoon, who is cofounder of OurFamilyWizard, a scheduling and communication tool for divorced and co-parenting couples that launched in 2001. It was Kathleen’s concept, while Jai developed the business plan, initially launching OurFamilyWizard as a website. It soon caught the attention of those working in the legal system, including Judge James Swenson, who ran a pilot program with the platform at the family court in Hennepin County, Minneapolis, in 2003. The project took 40 of what Kissoon says were the “most hardcore families,” set them up on the platform—and “they disappeared from the court system.” When someone eventually did end up in court—two years later—it was after a parent had stopped using it.

Two decades on, OurFamilyWizard has been used by around a million people and has gained court approval across the US. In 2015 it launched in the UK and a year later in Australia. It’s now in 75 countries; similar products include coParenter, Cozi, Amicable, and TalkingParents. Brian Karpf, secretary of the American Bar Association, Family Law Section, says that many lawyers now recommend co-parenting apps as standard practice, especially when they want to have a “chilling effect” on how a couple communicates. These apps can be a deterrent to harassment, and their use in communications can be court-ordered.

In a bid to encourage civility, AI has become an increasingly prominent feature. OurFamilyWizard has a “ToneMeter” function that uses sentiment analysis to monitor messages sent on the app—“something to give a yield sign,” says Kissoon. Sentiment analysis is a subset of natural language processing, the automated analysis of human language. Trained on vast language databases, these algorithms break down text and score it for sentiment and emotion based on the words and phrases it contains. In the case of the ToneMeter, if an emotionally charged phrase is detected in a message, a set of signal-strength bars goes red and the problem words are flagged. “It’s your fault that we were late,” for example, could be flagged as “aggressive.” Other phrases could be flagged as “humiliating” or “upsetting.” It’s up to the user whether they still want to hit send.
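How might such a tone check work under the hood? The sketch below is not OurFamilyWizard's ToneMeter code; it is a rough illustration of lexicon-based sentiment analysis, assuming NLTK's off-the-shelf VADER analyzer and a small, invented phrase list standing in for whatever categories a real product might use.

```python
# Illustrative sketch only -- not ToneMeter's implementation. It combines
# NLTK's VADER sentiment scorer with a hypothetical phrase-to-label lexicon.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
analyzer = SentimentIntensityAnalyzer()

# Hypothetical category lexicon: phrases mapped to the label a UI might show.
FLAGGED_PHRASES = {
    "your fault": "aggressive",
    "you always": "aggressive",
    "pathetic": "humiliating",
    "don't bother": "upsetting",
}

def check_tone(message: str) -> dict:
    """Score a draft message and flag any emotionally charged phrases."""
    scores = analyzer.polarity_scores(message)  # neg/neu/pos/compound scores
    flags = [
        {"phrase": phrase, "label": label}
        for phrase, label in FLAGGED_PHRASES.items()
        if phrase in message.lower()
    ]
    return {"compound": scores["compound"], "flags": flags}

# A negative compound score plus a flagged phrase is the kind of signal
# an app could use to show its "yield sign" before the user hits send.
print(check_tone("It's your fault that we were late."))
```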

ToneMeter was originally used in the messaging service, but it is now being coded into all points of exchange between parents in the app. Shane Helget, chief product officer, says that soon it will not only discourage negative communication but encourage positive language too. He is gathering insights from a vast array of interactions, with a view to using the app to proactively nudge parents to behave positively toward each other beyond regular conversations. There could be reminders to communicate schedules in advance, or offers to swap dates for birthdays or holidays—gestures that may not be required but could be well received.

CoParenter, which launched in 2019, also uses sentiment analysis. Parents negotiate via text and a warning pops up if a message is too hostile—much like a human mediator might shush their client. If the system does not lead to an agreement, there is the option to bring a human into the chat.

Deferring to an app for such emotionally fraught negotiations is not without issues. Kissoon was conscious not to allow the ToneMeter to score parents on how positive or negative they seem, and Karpf says he has seen a definite effect on users’ behavior. “The communications become more robotic,” he says. “You’re now writing for an audience, right?”

Co-parenting apps might be able to help steer a problem relationship, but they can’t solve it. Sometimes, they can make it worse. Karpf says some parents weaponize the app and send “bait” messages to wind up their spouse and goad them into sending a problem message: “A jerk parent is always going to be a jerk parent.” Kissoon recalls a conversation he had with a judge when he launched the pilot program. “The thing to remember about tools is that I can give you a screwdriver and you can fix a bunch of stuff with it,” the judge said. “Or you can go poke yourself in the eye.”
