San Francisco startup Sensely is bringing the love by using AI and digital nurses to make clinical conversations more understanding and empathetic


“My grandmother Rita has a chronic condition, and would regularly end up in hospital,” says Adam Odessky, founder and CEO of Sensely, an avatar-based, empathy-driven conversation platform. “During one of her emergency room admissions, I realised there was a pressing social need among elderly patients for help and support on a more frequent and understanding basis.”

A computer science graduate, Odessky began his career in tech right out of college. He was involved in several startup ventures in the early 2000s before moving to Chicago to work for Oracle. “Back then, Oracle and a few other companies were experimenting with voice technology,” he continues. “They wanted to develop a user interface to conversationally manage personal information. So, my role was to build the infrastructure that could support these kinds of interactive conversations.”

Odessky wondered if he could adapt the technology to make patient monitoring and interaction more effective by combining the best qualities of human interaction with the scalability of technology.

“Instead of calling a hotline, my idea was to use an interactive voice response (IVR) system to make regular, automated phone calls to patients like my grandmother and check how they are doing. But in a more understanding manner, rather than just asking questions and ticking the boxes. From there, perhaps we can inform doctors early enough for timely interventions to take place and do away with the need for hospitalization.”

In May 2013, Odessky co-founded Sensely with Dr. Ivana Schnur, who specializes in the use of avatars to enhance sensitive conversations between clinicians and veterans suffering from PTSD. Together, they built an application that helps physicians stay in touch with patients and prevent hospital readmissions. Sensely users can tell a nurse avatar (Molly) how they are doing with daily or weekly five-minute ‘check-ins’ on their smartphones.

“With avatars, we are taking chatbots to the next level,” Odessky says. “We want to make healthcare chatbots more human and interactive so we are now able to have multi-sensory conversations. In addition to seeing the text on the screen, users are also able to see the ‘digital human’ that they are interacting with. This ‘digital human’ is playing the role it’s supposed to play based on the context of the conversation.”

Patients can simply talk to the avatar without having to type anything. What they share is automatically recorded in their medical records, which only authorized health providers can review. Sensely leverages AI so its avatars are designed to speak empathetically, responding to patients’ mood, not just their symptoms and behaviors.

The platform’s core rule-based engine and algorithms were written around commonly accepted medical protocols for diagnosing and managing chronic diseases, so it works for patients of different ages and with different medical conditions. On top of ‘check-ins’, Sensely also gathers a range of data from patients’ wearables and other internet-connected medical devices for monitoring purposes.
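To give a feel for how a rule-based engine of this kind can flag check-in and device data for a clinician, here is a minimal sketch in Python. The protocol name, metrics, thresholds, and field names are all hypothetical illustrations, not Sensely’s actual engine or data model.

```python
# Illustrative only: a toy rule-based check-in evaluator. All thresholds
# and metric names below are made up for the example.
from dataclasses import dataclass

@dataclass
class Reading:
    metric: str   # e.g. "heart_rate", from a check-in or a wearable
    value: float

# Hypothetical protocol: acceptable ranges per metric for one condition.
PROTOCOL = {
    "heart_rate": (50, 110),    # beats per minute
    "spo2": (92, 100),          # blood oxygen saturation, %
    "weight_gain_kg": (0, 2),   # sudden gain can signal fluid retention
}

def evaluate_checkin(readings):
    """Return the metrics that fall outside protocol ranges, for review."""
    alerts = []
    for r in readings:
        low, high = PROTOCOL.get(r.metric, (float("-inf"), float("inf")))
        if not (low <= r.value <= high):
            alerts.append(f"{r.metric}={r.value} outside [{low}, {high}]")
    return alerts

alerts = evaluate_checkin([Reading("heart_rate", 118), Reading("spo2", 96)])
print(alerts)  # flags heart_rate for clinician follow-up
```

The appeal of encoding accepted protocols as explicit rules, rather than a learned model, is that clinicians can audit exactly why a patient was flagged.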

Sensely is currently deployed in 10 countries with about three million users. The company also partners with healthcare systems and institutions like the UK National Health Service (NHS), Mayo Clinic, Kaiser Permanente, Nippon Life and Allianz to provide customizable solutions for patient navigation and general care management. Since its Series B round in 2017, the company has raised $27 million, backed by the likes of Aflac Corporate Ventures, NMC and Nippon Life Insurance.

“Speech recognition and natural language processing, quite often, are just to enable people to speak and reveal what their symptoms or issues are,” says Odessky. “Beyond that, you need emotional recognition to understand whether the person is upset, angry or showing any other emotions. In healthcare, you really want to address those things. People needing health support or suffering chronic conditions are more likely to suffer other psychosocial issues like anxiety and depression. Visual recognition is so important to enable people to have access to more help than they may have just verbally asked for. It’s how we use other senses to help patients more holistically.”
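The idea of layering emotion recognition on top of speech-to-text output can be sketched with a deliberately simple example. Real systems use trained models over audio and text; this keyword scorer is a toy stand-in, not Sensely’s method, and every cue word and label below is an assumption made for illustration.

```python
# Toy mood detector over a speech-to-text transcript. A real emotion
# recognizer would use a trained model; this keyword count only
# illustrates the concept of reacting to mood, not just content.
NEGATIVE_CUES = {"worried", "scared", "anxious", "upset", "alone", "hopeless"}
POSITIVE_CUES = {"better", "fine", "good", "improving", "happy"}

def detect_mood(transcript: str) -> str:
    words = set(transcript.lower().replace(",", " ").replace(".", " ").split())
    neg = len(words & NEGATIVE_CUES)
    pos = len(words & POSITIVE_CUES)
    if neg > pos:
        return "distressed"   # could prompt a gentler tone or an escalation
    if pos > neg:
        return "positive"
    return "neutral"

print(detect_mood("I feel so anxious and alone lately"))  # distressed
```

A “distressed” result is the kind of signal that could surface anxiety or depression a patient never verbally asked for help with, which is the point Odessky makes above.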