This intelligent patient monitoring system uses an array of wearable sensors and cameras combined with machine-learning platforms to conduct autonomous, continuous observation and analysis of patient status and environmental conditions in the intensive care unit (ICU) or assisted living facility. The market for patient monitoring systems is projected to exceed $25 billion by 2023. In critical care environments, patients require extensive monitoring of their physiological state to inform treatment decisions and ensure proper recovery. While much of a patient’s assessment in the ICU happens automatically through bedside devices, available monitoring systems do not automatically quantify certain relevant factors, such as noise and light levels, the frequency of sleep-disrupting visitations, or signs of patient delirium. Other aspects of ICU care, such as assessing pain or physical function, require manual evaluations, which are subjective, lack precision, and add to the already heavy workload of hospital staff. Additionally, the intermittent nature of manual observations can delay medical interventions at critical times.
Researchers at the University of Florida have developed an autonomous ICU monitoring system that captures important patient information and environmental conditions and characterizes the physiological state of patients. The system uses a network of sensors and visual criteria to monitor and automatically analyze data such as a patient’s facial expressions, visitation frequency, and the room’s light and noise levels, providing continuous and automatic quantification of ICU factors for more timely medical interventions.
Autonomous ICU patient monitoring system that continuously collects and analyzes pervasive input and physiological information in real time
This autonomous patient monitoring system employs pervasive sensing and machine-learning procedures to evaluate an ICU patient’s environment and physiological status. A network of wearable sensors, room light and noise sensors, and high-resolution cameras continuously captures data on patients and their environment in the ICU. Machine-learning and statistical software analyzes the visual information, such as patient posture, expressions, and physical movements. In conjunction with the gathered environmental conditions, and with reference to the patient’s vital signs and electronic health record, this automated analysis enables the system to evaluate the physiological state of a patient, including quantifying pain levels or identifying conditions such as delirium.
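The kind of data fusion described above might be sketched as follows. This is a minimal illustration only: all field names, weights, and thresholds are hypothetical assumptions for the sketch, not the actual model, criteria, or software used by the University of Florida system.

```python
from dataclasses import dataclass

@dataclass
class RoomSnapshot:
    """Environmental readings from room sensors (hypothetical schema)."""
    noise_db: float        # ambient noise level, dB
    light_lux: float       # room illumination, lux
    visits_last_hour: int  # potentially sleep-disrupting visitations

@dataclass
class PatientSnapshot:
    """Patient readings from cameras, wearables, and bedside vitals (hypothetical schema)."""
    grimace_score: float   # 0-1 output of a facial-expression model
    movement_index: float  # 0-1 normalized posture/limb activity
    heart_rate: int        # beats per minute

def pain_estimate(p: PatientSnapshot) -> float:
    """Combine visual cues and a vital-sign cue into a 0-1 pain score.

    The weights below are arbitrary placeholders; a real system would
    learn them from labeled clinical data.
    """
    hr_excess = max(0.0, (p.heart_rate - 80) / 60)  # crude tachycardia cue
    return min(1.0, 0.6 * p.grimace_score
                    + 0.2 * p.movement_index
                    + 0.2 * hr_excess)

def environment_flags(r: RoomSnapshot) -> list[str]:
    """Flag environmental conditions worth surfacing to staff."""
    flags = []
    if r.noise_db > 45:          # illustrative threshold
        flags.append("noise")
    if r.light_lux > 100:        # illustrative threshold
        flags.append("light")
    if r.visits_last_hour > 2:   # illustrative threshold
        flags.append("visitation")
    return flags

if __name__ == "__main__":
    patient = PatientSnapshot(grimace_score=0.5, movement_index=0.5, heart_rate=110)
    room = RoomSnapshot(noise_db=50, light_lux=120, visits_last_hour=3)
    print(pain_estimate(patient))    # a single 0-1 score per sampling interval
    print(environment_flags(room))   # conditions to log or alert on
```

Continuously re-running such scoring on each sensor sample, rather than waiting for a staff member's rounds, is what allows the quantification to be both automatic and timely.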