MedPath

Pervasive Sensing and AI in Intelligent ICU

Conditions
Critical Illness
Pain
Delirium
Confusion
Interventions
Other: Video Monitoring
Other: Accelerometer Monitoring
Other: Noise Level Monitoring
Other: Light Level Monitoring
Other: Air Quality Monitoring
Other: EKG Monitoring
Other: Vitals Monitoring
Other: Biosample Collection
Other: Delirium Motor Subtyping Scale 4 (DMSS-4)
Registration Number
NCT05127265
Lead Sponsor
University of Florida
Brief Summary

Important information related to the visual assessment of patients, such as facial expressions, head and extremity movements, posture, and mobility, is captured only sporadically by overburdened nurses, or not captured at all. Consequently, these important visual cues, although associated with critical indices such as physical functioning, pain, delirious state, and impending clinical deterioration, often cannot be incorporated into assessments of clinical status. The overall objectives of this project are to sense, quantify, and communicate patients' clinical conditions in an autonomous and precise manner, and to develop a pervasive intelligent sensing system that combines deep learning algorithms with continuous data from inertial, color, and depth image sensors for autonomous visual assessment of critically ill patients. The central hypothesis is that deep learning models will be superior to existing clinical acuity scores, predicting acuity in a dynamic, precise, and interpretable manner using autonomous assessment of pain, emotional distress, and physical function together with clinical and physiologic data.

Detailed Description

The under-assessment of pain is one of the primary barriers to adequate treatment of pain in critically ill patients, and is associated with many negative outcomes such as chronic pain after discharge, prolonged mechanical ventilation, longer ICU stay, and increased mortality risk. Many ICU patients cannot self-report their pain intensity because of their clinical condition, ventilation devices, or altered consciousness. Monitoring patients' pain status is yet another task for overworked nurses, and because of pain's subjective nature, assessments may vary among care staff. These challenges point to a critical need for objective and autonomous pain recognition systems. Delirium is another common complication of hospitalization, characterized by changes in cognition, activity level, consciousness, and alertness, and occurring at rates of up to 80% in surgical patients. Risk factors associated with delirium include age, preexisting cognitive dysfunction, vision and hearing impairment, severe illness, dehydration, electrolyte abnormalities, overmedication, alcohol abuse, and disrupted sleep patterns. Estimates suggest that about one third of delirium cases can benefit from drug and non-drug prevention and intervention. However, the detection and prediction of pain and delirium remain very limited in practice.

The aim of this study is to evaluate the ability of the investigators' proposed model to leverage accelerometer, environmental, circadian rhythm biomarker, and video data to autonomously quantify pain, characterize functional activities, and assess delirium status. The Autonomous Delirium Monitoring and Adaptive Prevention (ADAPT) system will use novel pervasive sensing and deep learning techniques to autonomously quantify patients' mobility and circadian dyssynchrony in terms of nightly disruptions, light intensity, and sound pressure level, allowing these risk factors to be integrated into a dynamic model for predicting delirium trajectories. Commercially available cameras will monitor patients' facial expressions and contextualize patients' actions by providing imaging data with additional information on patient movement. Commercially available environmental sensors will provide data on illumination, decibel level, and air quality. Patient blood samples will help determine each patient's circadian rhythm and will be used to compare against and validate the pervasive sensing system's ability to autonomously monitor circadian dyssynchrony. Electronic health record data will also be collected.
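The quantification of circadian dyssynchrony in terms of nightly disruptions, light intensity, and sound pressure level described above can be sketched roughly in code. The data layout, thresholds, and night window below are illustrative assumptions for the sketch, not values from the ADAPT system:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    hour: int        # local hour of day, 0-23
    lux: float       # measured light intensity
    decibels: float  # measured sound pressure level

def nightly_disruptions(readings: list[Reading],
                        lux_max: float = 100.0,
                        db_max: float = 45.0) -> int:
    """Count nighttime readings (22:00-06:00) where light or noise
    exceeds the given ceilings; thresholds here are hypothetical."""
    night = (r for r in readings if r.hour >= 22 or r.hour < 6)
    return sum(1 for r in night if r.lux > lux_max or r.decibels > db_max)
```

A per-night count like this could serve as one feature, alongside mobility and facial-expression outputs, in a dynamic delirium-risk model.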

Recruitment & Eligibility

Status
RECRUITING
Sex
All
Target Recruitment
400
Inclusion Criteria
  • aged 18 or older
  • admitted to UF Health Shands Gainesville ICU ward
  • expected to remain in ICU ward for at least 24 hours at time of screening
Exclusion Criteria
  • under the age of 18
  • on any contact/isolation precautions
  • expected to transfer or discharge from the ICU in 24 hours or less
  • unable to provide self-consent or has no available proxy/LAR

Study & Design

Study Type
OBSERVATIONAL
Study Design
Not specified
Arms & Interventions
Group | Intervention | Description
adult ICU patients | Light Level Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Air Quality Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Delirium Motor Subtyping Scale 4 (DMSS-4) | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Biosample Collection | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Video Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Accelerometer Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Noise Level Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | Vitals Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
adult ICU patients | EKG Monitoring | adult patients aged 18 or older admitted to University of Florida Health Shands Gainesville ICU wards
Primary Outcome Measures
Name: Decibel Levels
Time: Noise sensor data collected continuously for up to 7 days maximum.
Method: Determine relative decibel (noise loudness) levels in the study patient's ICU room to alert for abnormalities in decibel level (noisiness of the environment).

Name: Air Quality
Time: Air quality sensor data collected continuously for up to 7 days maximum.
Method: Determine relative air pollution levels in the study patient's ICU room to alert for abnormalities in room air quality.

Name: Lux Levels
Time: Light sensor data collected continuously for up to 7 days maximum.
Method: Determine relative lux (light illumination) levels in the study patient's ICU room to alert for abnormalities in illumination level.

Name: Algorithmic Delirium Recognition Profile
Time: Data collected for up to 7 days maximum.
Method: The algorithm's output will report whether the patient is likely to be delirious or at risk of delirium based on activity, facial expression, and circadian dyssynchrony index data collected from study devices and biosamples.

Name: Circadian Dyssynchrony Index
Time: Change in internal circadian profile from Day 1 to Day 2.
Method: Blood and urine samples will be collected and processed to determine the presence of dyssynchrony in a subject's internal circadian clock.

Name: Delirium Motor Subtyping Scale 4 (DMSS-4)
Time: Changes from baseline up to a maximum of 7 days.
Method: Determines which subtype of delirium a subject is experiencing. This subtyping scale has 13 symptom items (5 hyperactive and 8 hypoactive) derived from the 30-item Delirium Motor Checklist. To subtype a delirious subject, at least 2 symptoms must be present from either the hyperactive or the hypoactive checklist to meet the criteria for 'hyperactive delirium' or 'hypoactive delirium'. Patients who meet both hyperactive and hypoactive criteria are classified as 'mixed subtype', while patients meeting neither hyperactive nor hypoactive criteria are labeled 'no subtype'.
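The DMSS-4 subtyping rule just described is a simple decision procedure, sketched below. The symptom-count thresholds and labels come from the description above; the function name and the use of symptom counts as inputs are illustrative assumptions:

```python
def dmss4_subtype(hyperactive_present: int, hypoactive_present: int) -> str:
    """Classify delirium motor subtype per the DMSS-4 rule.

    hyperactive_present: number of the 5 hyperactive symptom items present.
    hypoactive_present: number of the 8 hypoactive symptom items present.
    At least 2 symptoms from a checklist are required to meet that subtype.
    """
    hyper = hyperactive_present >= 2
    hypo = hypoactive_present >= 2
    if hyper and hypo:
        return "mixed subtype"
    if hyper:
        return "hyperactive delirium"
    if hypo:
        return "hypoactive delirium"
    return "no subtype"
```

For example, a subject with 3 hyperactive and 0 hypoactive symptoms would be labeled 'hyperactive delirium', while one with 2 of each would be 'mixed subtype'.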

Name: Algorithmic Activity Labeling
Time: Image frames collected continuously for up to 7 days maximum.
Method: The algorithm's output will report which activity the patient is performing in the corresponding image data.

Name: Algorithmic Pain Labeling
Time: Image frames collected continuously for up to 7 days maximum.
Method: The algorithm's output will report whether the patient is experiencing pain in the corresponding image data.

Secondary Outcome Measures
Name: Mortality
Time: From baseline (study enrollment) up to a maximum of 7 days.
Method: Status of alive or deceased.

Trial Locations

Locations (1)

University of Florida Health Shands Hospital

Gainesville, Florida, United States
