Pervasive Sensing and AI in Intelligent ICU
- Conditions
- Critical Illness
- Pain
- Delirium
- Confusion
- Interventions
- Other: Video Monitoring
- Other: Accelerometer Monitoring
- Other: Noise Level Monitoring
- Other: Light Level Monitoring
- Other: Air Quality Monitoring
- Other: EKG Monitoring
- Other: Vitals Monitoring
- Other: Biosample Collection
- Other: Delirium Motor Subtyping Scale 4 (DMSS-4)
- Registration Number
- NCT05127265
- Lead Sponsor
- University of Florida
- Brief Summary
Important information related to the visual assessment of patients, such as facial expressions, head and extremity movements, posture, and mobility, is captured only sporadically by overburdened nurses, or is not captured at all. Consequently, these important visual cues, although associated with critical indices such as physical functioning, pain, delirious state, and impending clinical deterioration, often cannot be incorporated into assessments of clinical status. The overall objectives of this project are to sense, quantify, and communicate patients' clinical conditions in an autonomous and precise manner, and to develop a pervasive intelligent sensing system that combines deep learning algorithms with continuous data from inertial, color, and depth image sensors for autonomous visual assessment of critically ill patients. The central hypothesis is that deep learning models will be superior to existing clinical acuity scores, predicting acuity in a dynamic, precise, and interpretable manner by using autonomous assessment of pain, emotional distress, and physical function together with clinical and physiologic data.
- Detailed Description
The under-assessment of pain is one of the primary barriers to adequate pain treatment in critically ill patients and is associated with many negative outcomes, such as chronic pain after discharge, prolonged mechanical ventilation, longer ICU stay, and increased mortality risk. Many ICU patients cannot self-report their pain intensity because of their clinical condition, ventilation devices, or altered consciousness. Monitoring patients' pain status is yet another task for overworked nurses, and because pain is subjective, assessments may vary among care staff. These challenges point to a critical need for objective and autonomous pain recognition systems. Delirium is another common complication of hospitalization; it is characterized by changes in cognition, activity level, consciousness, and alertness, and occurs at rates of up to 80% in surgical patients. Risk factors associated with delirium include age, preexisting cognitive dysfunction, vision and hearing impairment, severe illness, dehydration, electrolyte abnormalities, overmedication, alcohol abuse, and disrupted sleep patterns. Estimates suggest that about one third of delirium cases can benefit from drug and non-drug prevention and intervention. In practice, however, detection and prediction of pain and delirium remain very limited.
The aim of this study is to evaluate the ability of the investigators' proposed model to leverage accelerometer, environmental, circadian rhythm biomarker, and video data to autonomously quantify pain, characterize functional activities, and assess delirium status. The Autonomous Delirium Monitoring and Adaptive Prevention (ADAPT) system will use novel pervasive sensing and deep learning techniques to autonomously quantify patients' mobility and circadian dyssynchrony in terms of nightly disruptions, light intensity, and sound pressure level, allowing these risk factors to be integrated into a dynamic model for predicting delirium trajectories. Commercially available cameras will monitor patients' facial expressions and contextualize patients' actions by providing imaging data with additional information on patient movement. Commercially available environmental sensors will provide data on illumination, decibel level, and air quality. Patient blood samples will be used to determine circadian rhythm and to validate the pervasive sensing system's ability to autonomously monitor circadian dyssynchrony. Electronic health record data will also be collected.
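As a rough illustration of the kind of multimodal fusion described above, the sketch below combines hourly accelerometer, environmental, and video-derived features with a circadian dyssynchrony index in a small recurrent model that outputs a delirium-risk trajectory. The feature dimensions, architecture, and names are illustrative assumptions and do not represent the ADAPT system's actual design.

```python
# Hypothetical sketch: fusing multimodal ICU sensor features into a
# sequence model that outputs an hourly delirium-risk trajectory.
# Dimensions, names, and architecture are assumptions for illustration,
# not the ADAPT system's actual implementation.
import torch
import torch.nn as nn

class DeliriumTrajectoryModel(nn.Module):
    def __init__(self, accel_dim=16, env_dim=3, video_dim=128, hidden_dim=64):
        super().__init__()
        # Project each modality to a shared embedding size before fusion.
        self.accel_proj = nn.Linear(accel_dim, hidden_dim)
        self.env_proj = nn.Linear(env_dim, hidden_dim)      # lux, decibels, air quality
        self.video_proj = nn.Linear(video_dim, hidden_dim)  # activity/facial-expression embedding
        # A GRU models the temporal evolution of the fused hourly features.
        self.gru = nn.GRU(hidden_dim * 3 + 1, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # per-hour delirium-risk logit

    def forward(self, accel, env, video, circadian_index):
        # Each input: (batch, hours, features); circadian_index: (batch, hours, 1)
        fused = torch.cat(
            [self.accel_proj(accel), self.env_proj(env),
             self.video_proj(video), circadian_index], dim=-1)
        hidden, _ = self.gru(fused)
        return torch.sigmoid(self.head(hidden))  # (batch, hours, 1) risk trajectory

# Example with random stand-in data: 2 patients, 24 hourly windows.
model = DeliriumTrajectoryModel()
risk = model(torch.randn(2, 24, 16), torch.randn(2, 24, 3),
             torch.randn(2, 24, 128), torch.randn(2, 24, 1))
print(risk.shape)  # torch.Size([2, 24, 1])
```

In practice, any such model would be trained and validated against clinician assessments such as the DMSS-4 listed under the outcome measures below.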
Recruitment & Eligibility
- Status
- RECRUITING
- Sex
- All
- Target Recruitment
- 400
- Inclusion Criteria
  - aged 18 or older
  - admitted to UF Health Shands Gainesville ICU ward
  - expected to remain in ICU ward for at least 24 hours at time of screening
- Exclusion Criteria
  - under the age of 18
  - on any contact/isolation precautions
  - expected to transfer or discharge from the ICU in 24 hours or less
  - unable to provide self-consent or has no available proxy/LAR
Study & Design
- Study Type
- OBSERVATIONAL
- Study Design
- Not specified
- Arms & Interventions
| Group | Intervention | Description |
| --- | --- | --- |
| adult ICU patients | Light Level Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Air Quality Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Delirium Motor Subtyping Scale 4 (DMSS-4) | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Biosample Collection | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Video Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Accelerometer Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Noise Level Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | Vitals Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
| adult ICU patients | EKG Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards |
- Primary Outcome Measures
- Decibel Levels (noise sensor data collected continuously for up to 7 days): Determines relative decibel (noise loudness) levels in the study patient's ICU room to alert for abnormalities in environmental noise level.
- Air Quality (air quality sensor data collected continuously for up to 7 days): Determines relative air pollution levels in the study patient's ICU room to alert for abnormalities in room air quality.
- Lux Levels (light sensor data collected continuously for up to 7 days): Determines relative lux (illumination) levels in the study patient's ICU room to alert for abnormalities in illumination level.
- Algorithmic Delirium Recognition Profile (data collected for up to 7 days): The algorithm's output will report whether the patient is likely to be delirious or at risk of delirium, based on activity, facial expression, and circadian dyssynchrony index data collected from study devices and biosamples.
- Circadian Dyssynchrony Index (change in internal circadian profile from Day 1 to Day 2): Blood and urine samples will be collected and processed to determine the presence of dyssynchrony in the subject's internal circadian clock.
- Delirium Motor Subtyping Scale 4 (DMSS-4) (changes from baseline up to a maximum of 7 days): Determines which subtype of delirium a subject is experiencing. The scale has 13 symptom items (5 hyperactive and 8 hypoactive) derived from the 30-item Delirium Motor Checklist. To subtype a delirious subject, at least 2 symptoms must be present from either the hyperactive or the hypoactive checklist to meet the criteria for 'hyperactive delirium' or 'hypoactive delirium'. Patients who meet both the hyperactive and hypoactive criteria are classified as 'mixed subtype', while patients meeting neither criterion are labeled 'no subtype' (see the illustrative scoring sketch after this list).
- Algorithmic Activity Labeling (image frames collected continuously for up to 7 days): The algorithm's output will report which activity the patient is performing in the corresponding image data.
- Algorithmic Pain Labeling (image frames collected continuously for up to 7 days): The algorithm's output will report whether the patient is experiencing pain in the corresponding image data.
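The sketch below is a minimal encoding of the DMSS-4 subtyping rule described above: at least 2 present symptoms on the 5-item hyperactive or 8-item hypoactive checklist meets that checklist's criteria, both met yields 'mixed subtype', and neither yields 'no subtype'. The function name and input format are hypothetical.

```python
# Hypothetical sketch of the DMSS-4 subtyping rule; item order and
# input format are illustrative assumptions.
def dmss4_subtype(hyperactive_items, hypoactive_items, threshold=2):
    """hyperactive_items: 5 booleans; hypoactive_items: 8 booleans (symptom present/absent)."""
    hyper = sum(hyperactive_items) >= threshold
    hypo = sum(hypoactive_items) >= threshold
    if hyper and hypo:
        return "mixed subtype"
    if hyper:
        return "hyperactive delirium"
    if hypo:
        return "hypoactive delirium"
    return "no subtype"

# Example: 3 of 5 hyperactive items present, 1 of 8 hypoactive items present.
print(dmss4_subtype([1, 1, 1, 0, 0], [0, 1, 0, 0, 0, 0, 0, 0]))  # hyperactive delirium
```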
- Secondary Outcome Measures
- Mortality (from baseline at study enrollment up to a maximum of 7 days): Status of alive or deceased.
Trial Locations
- Locations (1)
University of Florida Health Shands Hospital
🇺🇸 Gainesville, Florida, United States