Q&A: A Virtual Visit With Molly, Sense.ly’s Robot Nurse That Keeps Tabs on Patients


A mockup of Molly the robotic nurse. Photo: Sense.ly.




Sense.ly, based in California, is a company that provides a “virtual nurse” to carry out personalized monitoring and follow-up care for patients around the clock, especially those with chronic diseases. Sense.ly and its front-end avatar, Molly, can serve as a go-between for patients and their healthcare providers, talking to patients in plain language, monitoring their vital signs, and warning doctors if patients are at high risk. Patients connect to the system through their phones or computers and can link devices such as glucometers, which Molly can then read. Companies using the system include major providers, payers, and telemedicine companies. Mission Critical recently talked with company cofounder and CEO Adam Odessky.



Q: What healthcare market conditions led you to create Molly?

A: There were definitely several forces that kind of created the perfect storm. First of all, it’s the Affordable Care Act. A fundamental principle of the Affordable Care Act is moving the patient reimbursement … from a pay-per-procedure model to a pay-per-outcome model, so basically the incentives for doctors are slowly beginning to change, where they are not necessarily incentivized to see more patients and to see more patients more frequently because they get paid … based on the quantity of patients they see. But rather, they get paid based on the outcome, if the patient is getting healthy or not. … 

And the [U.S.] healthcare system is one of the more expensive systems, if not the most expensive system, in the modern industrial world, and it causes a burden for doctors and budgets alike. There are over 100 million Americans with a chronic disease out there, and doctors and budgets are struggling to cope. So one of the motivations we had is to lessen the cost burden on both the providers as well as the insurance companies to enable patients to live healthier, more productive lives. 

I would say the last trend is around the revolution in mobile technologies, just general computing that has come about in the last few years. You can now do so much with your smartphone that you couldn’t do before. You can have the smartphone call you, message you, interact with you, connect to any kind of device, like via Bluetooth or via wire, and bring all these different kind of sensors together in a fashion that would benefit the patient. 



Q: What was the thinking that led to the Molly avatar in particular?

A: We’ve done a lot of tests, looked at a lot of avatars that are out there. We looked at what worked, what didn’t work, what we liked. … We tested on some people, found what they liked, what kind of voice they preferred to hear, and through trial and error we arrived at Molly. We didn’t think actually that she was going to become that popular, but she did, so that’s good. … There’s something to be said about her calmness, her lack of judgment, which I think is important when you have these kind of doctor-patient conversations. 



Q: What connection capability do people need to have in order to interact with Molly and their physician?

A: It works on a smartphone, iPhone or Android. It also works within a Web browser on a computer, and for connectivity we require you to have a network connection, because she is dynamic in what she says and how she talks to you. … We perform both text-to-speech and speech recognition in the cloud, so network connectivity is required.



Q: You mention that your technology is “future proofed” and is scalable. What does that mean?

A: We use very off-the-shelf commoditized components for infrastructure. … It’s future-proof, because we can swap components in and out without affecting the whole system. There’s some customized software, but there’s not a lot of customized hardware. As technology evolves, as interactions evolve, as the Web evolves, our software will not become obsolete, because it’s built on Web standards. 

From a scalability perspective, we can support hundreds of thousands of patients and we can grow as our customers grow. We don’t foresee a limit of usage that we can hit that will prevent the software from being expanded into other hospitals and other markets.



Q: How does Sense.ly identify patient stress remotely?

A: There are lots of different ways. There are algorithms to gauge voice tones and audio tones to look for stress patterns. There are algorithms out there to look for phrasing, in the way a person speaks and the kind of word choice they use, to identify from the word choice what kind of mood and what kind of stress they are under. … We don’t have any active pilots — just to set the record straight — using that kind of stuff right now. We see it more being used in the future and in a more experimental fashion.



Q: You have identified several partners in the healthcare field. What work are they doing with your systems?

A: They are using the system for lots of different use cases. Some are using the system for physical therapy, some are using it for behavioral health and addiction treatment. We have a couple of pilots in heart failure. And some hospitals are using this for follow-ups with patients, telemedicine, and some are using it to measure glucose for diabetes, so there are many uses. 



Q: How do the personalized solutions for various ailments, such as diabetes, work?

A: We have a conversation creator that allows medical personnel to create a customized conversation that blends in voice, speech and medical devices. So for diabetes, we have some protocols developed that check the patient’s blood sugar by enabling them to plug in their glucometer. We read that glucometer — Molly reads that glucometer — and then we also ask them questions about diet, food choices, and calories, and we can calculate risk against those answers. And if the risk is too high, we let the clinician know, and they can take appropriate action.



Q: Can you foresee this kind of service expanding into more healthcare realms in the future, or is this the sweet spot for it?

A: I see it going beyond health care. We work with adjunct healthcare organizations like insurance companies and pharma. Our focus is on the healthcare domain, but there is definitely the possibility for this kind of technology to be used outside the domain as well.



Q: What kind of feedback are you getting?

A: The feedback has been mostly positive, if not all positive. The patients really like Molly. They understand that she’s a robot, but she’s very pleasant, understandable. She speaks slowly enough for them to clearly retain instructions. … Clinicians really like it, because she’s able to offload a lot of these kinds of onerous tasks that they perform every day, like taking the patient’s blood pressure and weight and glucose and asking them what’s their pain level on a scale of one to five. So basically she’s there as this tertiary level of care to automate these basic tasks. You have your doctor that’s your primary care, you have your nurse that’s your secondary and does the more rigorous work, and then you have Molly, who’s the tertiary level that does the basic tasks that you don’t need a human to perform, because the tasks themselves are very scripted and basic.



Q: What sort of negative reactions have you gotten?

A: There are some patients who thought she [Molly] may be a bit too young, or too old even, and the feedback that gave us is that we really want to give people a choice of avatars, ones that they like and ones that they have an affinity to. One of the things we’re doing is expanding beyond Molly to different avatars that patients can choose, or that we can choose for them depending on their demographic.
