This startup aims to improve conversations in the workplace with “empathy as a service” software
Most of us have had the experience of sending a text or email that came across as callous or angry, even though that wasn’t our intention.
Unfortunately, the lack of social cues in such messages makes them much easier to misinterpret. Depending on the communication, that can lead to misunderstandings, hurt feelings or worse. It’s a gap that Bellevue, Wash.-based mpathic wants to close using empathetic AI.
Building on the knowledge and data sets gathered over the past decade, mpathic has set itself the goal of promoting human connection and understanding in the workplace.
To do this, the team created plugins for its empathy-as-a-service, or EaaS, platform that help humans talk to humans using real-time text corrections. Texts and emails can be reviewed and changes suggested before the user clicks “Send.” By adding these features to platforms like Slack or Gmail, mpathic hopes to bring more empathy to the business communication landscape.
“We realized that all of this could be mediated by an AI empathy engine, almost like Grammarly for empathy,” said co-founder Grin Lord. “We have had some amazing developments in AI that allow us to do this now in real time, making this the first time in human history that we can achieve dynamic, real-time empathy correction.”
In an example from a recent pitch, the service suggested replacing an inflammatory message like, “Why does Nic always schedule these meetings at the last minute? Am I right?” with a more open-ended question: “What do you think about the change in the meeting?”
Based on years of research into human interaction, mpathic offers a unique approach to guide users. Lord, who has a PhD in psychology, initially built the mpathic data set on knowledge she gained from research in the early 2000s at Seattle’s Harborview Medical Center, the only Level I trauma center in Washington state.
There, Lord was part of a research group on empathetic listening. Drivers arrested for DUI after a car accident were frequently brought to Harborview. Rather than handing the drivers flyers, telling them what to do or shaming them, the researchers would listen to them, perhaps for 15 or 20 minutes, following specific protocols. In a randomized controlled trial, they found a measurable decrease in alcohol consumption among the drivers that lasted for up to three years, as well as a 48% reduction in hospital readmissions. Not only did this help the subjects recover, but it led to significant cost reductions, as well as greater public safety.
Since then, Lord has been involved with other startups including Lyssn, a platform to assess the empathy and engagement of behavioral health providers during clinical sessions.
Before mpathic’s launch, the team created Empathy Rocks, a gamified platform that uses empathic AI to build human connection. The platform allows practitioners to improve their empathetic listening skills while earning continuing education credits.
But it was at the start of Empathy Rocks’ seed funding phase that Lord and her co-founder Nic Bertagnolli realized they already had a viable product in the platform’s underlying empathy engine. Pivoting, they launched mpathic to make the engine more accessible and widely available.
Developing both the “Grammarly for empathy” plugins and an API, mpathic wants to do more than just foster good relationships between employees. Given the increasing globalization of many companies and the growing pool of employees from other parts of the country and the world, mpathic is keen to provide human resources departments with a tool that can help facilitate employee onboarding. Since different regions have different ideas and attitudes about what constitutes civil and sensitive behavior, mpathic can be used to help bring new hires into their new team more quickly.
Lord is quick to point out that mpathic not only suggests text corrections, but also makes other kinds of behavioral suggestions. In this way, the user builds an understanding of empathic communication and behavior through context, use and repetition.
âWe’re actually making some very behavioral fixes,â Lord said. âSo it may not even be a replacement of a word or a transformation of the text. Instead, the AI ââmay suggest calling a meeting or making a phone call, as some things don’t need to be in an email.
Although mpathic originated from Empathy Rocks, the gamified training platform continues to provide empathic listening training as it acquires new data that is used to train mpathic’s EaaS. The platform was designed by the team’s empathy designer, Dr. Jolley Paige, who notes the many factors that need to be taken into account at a time when AI bias is such a concern.
“We were thinking about gender, age, culture, where you are in the country, but also different abilities,” Jolley said. “So if someone has a language processing disorder, what impact would that have on the way they interact with this game?”
While some people may have concerns about using AI to change human behavior, many companies see the value of such an approach. “Some of our early business partners are considering connecting mpathic to their Slack, Gmail or whatever, mainly because they are interested in this idea of quickly onboarding cross-cultural and global teams,” Lord said. “I think this can be helpful in unifying the language of mission values for a business.”
Last month, mpathic was one of 14 startups that presented at PIE Demo Day. PIE (Portland Incubator Experiment) is led by CEO Rick Turoczy and seeks to provide founders — often first-time entrepreneurs — with access to mentorship and networks.
Empathy Rocks and mpathic intentionally collect and curate their data to include underrepresented voices, and they are part of All Tech Is Human and other communities committed to the ethical development of AI.
Empathetic AI is part of a much larger field of computing, originally known as affective computing and more recently referred to as emotional AI or artificial emotional intelligence. Emerging from the MIT Media Lab and other research institutes around 25 years ago, emotional AI involves systems that can read, interpret, and interact with human emotions. Since emotion, and above all empathy, is at the heart of the human condition, such work has the potential to make our technologies interact more easily, humanely and responsibly with people, both at home and in the workplace.