University of Washington researchers have developed a new tool that can monitor individuals for cardiac arrest while they are sleeping, without touching them. A new skill for a smart speaker - like Amazon Alexa or Google Home - or a smartphone lets the device detect the gasping sound of agonal breathing and call for help. The proof-of-concept tool, developed using real instances of agonal breathing captured from 911 calls, detected agonal breathing events 97 percent of the time from up to 20 feet (6 meters) away. The researchers published their findings in npj Digital Medicine.
Shyam Gollakota, an associate professor in the UW's Paul G. Allen School of Computer Science & Engineering and a co-corresponding author of the study, said that many people have smart speakers in their homes, and these devices have amazing capabilities that can be put to good use. The researchers envision a contactless system that continuously and passively monitors the bedroom for an agonal breathing event and alerts anyone nearby to provide CPR; if there is no response, the device can automatically call 911.
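As a rough illustration of that escalation flow, the Python sketch below alerts anyone nearby first and falls back to an automatic emergency call only if no one responds within a short window. The `speaker` object, its methods, and the timing are hypothetical stand-ins for illustration, not part of the published system.

```python
import time

# Hypothetical escalation logic for a contactless monitoring skill:
# alert anyone nearby to start CPR, and place an emergency call only
# if no one responds within a short window. Names and timings here
# are illustrative assumptions, not taken from the study.

RESPONSE_WINDOW_SECONDS = 30

def handle_agonal_breathing_event(speaker):
    # Ask anyone within earshot to respond and begin CPR.
    speaker.announce("Possible cardiac arrest detected. Please respond and start CPR.")

    deadline = time.time() + RESPONSE_WINDOW_SECONDS
    while time.time() < deadline:
        if speaker.heard_response():
            return  # A bystander acknowledged; no automatic call needed.
        time.sleep(1)

    # No one responded in time: escalate to emergency services.
    speaker.call_emergency_services("911")
```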
According to 911 call data, agonal breathing is often present in individuals experiencing cardiac arrest, and patients who take agonal breaths often have a better chance of surviving.
Dr. Jacob Sunshine, a co-corresponding author of the study and an assistant professor of anesthesiology and pain medicine at the UW School of Medicine, said that this kind of breathing happens when a patient experiences very low oxygen levels. It's a sort of guttural gasping noise, and its uniqueness makes it an excellent biomarker for identifying whether someone is experiencing cardiac arrest.
Using real 911 calls to Seattle's Emergency Medical Services, the team gathered sounds of agonal breathing. Because cardiac arrest patients are often unconscious, bystanders recorded the agonal breathing sounds by holding their phones up to the patient's mouth so the dispatcher could determine whether the patient needed immediate CPR. The team collected 162 calls made between 2009 and 2017 and extracted 2.5 seconds of audio at the start of each agonal breath, yielding a total of 236 clips. The researchers captured the recordings on different smart devices - an Amazon Alexa, an iPhone 5s, and a Samsung Galaxy S4 - and used various machine learning techniques to boost the dataset to 7,316 positive clips.
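The article does not detail the augmentation steps, but a common way to grow a small set of positive audio clips is to generate perturbed variants of each one. The Python sketch below, with illustrative time shifts and noise levels, shows the general idea under that assumption and should not be read as the team's exact pipeline.

```python
import numpy as np

def augment_clip(clip: np.ndarray, sample_rate: int = 16000) -> list:
    """Generate perturbed variants of one 2.5-second agonal-breathing clip.

    Hypothetical augmentation sketch: time shifting and added background
    noise are common choices, but the study's actual techniques may differ.
    """
    variants = []

    # Shift the clip in time so the gasp does not always start at the
    # same offset within the recording window.
    for shift_ms in (-200, -100, 100, 200):
        shift = int(sample_rate * shift_ms / 1000)
        variants.append(np.roll(clip, shift))

    # Mix in low-level Gaussian noise to mimic different room conditions.
    for noise_std in (0.002, 0.005):
        noise = np.random.normal(0.0, noise_std, size=clip.shape)
        variants.append(clip + noise)

    return variants
```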
The team plans to commercialize this technology through a UW spinout, Sound Life Sciences, Inc. Sunshine noted that cardiac arrest is a very common way for people to die, and at present many cases go unwitnessed. Part of what makes this technology so compelling, he said, is that it could help reach more patients in time for them to be treated.