
Prediction of Anxiety and Memory State

Conditions
Anxiety
Epilepsy
Memory
Interventions
Other: CAMERA (Context-Aware Multimodal Ecological Research and Assessment)
Registration Number
NCT06551090
Lead Sponsor
Columbia University
Brief Summary

The purpose of this study is to look at how signals in the brain, body, and behavior relate to anxiety and memory function. This project seeks to develop the CAMERA (Context-Aware Multimodal Ecological Research and Assessment) platform, a state-of-the-art open multimodal hardware/software system for measuring human brain-behavior relationships.

The R61 portion of the project is designed to develop the CAMERA platform, which will use multimodal, passive sensor data to predict anxiety-memory state in patients undergoing inpatient monitoring with intracranial electrodes for clinical epilepsy, and to build CAMERA's passive and active data frameworks.

Detailed Description

CAMERA will record neural, physiological, behavioral, and environmental signals, as well as measurements from ecological momentary assessments (EMAs), to develop a continuous high-resolution prediction of a person's level of anxiety and cognitive performance. CAMERA will provide a significant advance over current methods for human behavioral measurement because it leverages the complementary features of multimodal data sources and combines them with interpretable machine learning to predict human behavior. A further distinctive aspect of CAMERA is that it incorporates context-aware, adaptive EMA, where the timing of assessments depends on the subject's physiology and behavior to improve response rates and model learning. In this study, CAMERA focuses on predicting anxiety state and concurrent memory performance, but the platform is flexible for use in various domains.
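As an illustration of context-aware prompting, the sketch below shows one way a trigger could gate EMA delivery on physiology. The arousal score, threshold, and spacing rule are assumptions made for this sketch, not CAMERA's actual logic.

```python
import random
import time

# Hypothetical context-aware EMA trigger. All names, signals, and
# thresholds here are assumptions for illustration, not the actual
# CAMERA implementation.

AROUSAL_THRESHOLD = 0.7     # assumed cutoff on a normalized arousal score
MIN_GAP_SECONDS = 45 * 60   # assumed minimum spacing between prompts

def arousal_score() -> float:
    """Placeholder for a wristband-derived arousal estimate in [0, 1]."""
    return random.random()

def should_prompt_ema(last_prompt: float, now: float) -> bool:
    """Prompt only when enough time has passed and physiology suggests
    elevated arousal, so assessment timing tracks the subject's state."""
    if now - last_prompt < MIN_GAP_SECONDS:
        return False
    return arousal_score() > AROUSAL_THRESHOLD

if should_prompt_ema(last_prompt=0.0, now=time.time()):
    print("Trigger EMA: anxiety scale + memory task")
```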

Currently, it is challenging to study complex, longitudinal relationships between the brain, body, and environment in humans. Most existing tools do not allow investigators to measure transient internal states or cognitive functions comprehensively or continuously. Instead, investigators typically rely on sparsely collected and constrained self-reports or experimental constructs, including EMA.

Recruitment & Eligibility

Status
RECRUITING
Sex
All
Target Recruitment
40
Inclusion Criteria
  • Patients must have known or suspected Temporal Lobe Epilepsy.
  • Native or proficient speaker of English or Spanish.
  • Stereoelectroencephalography (sEEG) cases: The implant plan must include hippocampal head, body, and tail electrodes either unilaterally or bilaterally.
  • 7th grade reading level (minimum level considered literate for adults)
Exclusion Criteria
  • Hearing impairment not corrected with a hearing aid
  • Unable to read a newspaper at arm's length even with corrective lenses.
  • Objective intellectual impairment (estimated IQ < 70)
  • Any history of electroconvulsive therapy or psychosis (except postictal psychosis)
  • Psychotic disorder (lifetime)
  • Current Anxiety disorder, Major Depressive Disorder, or Bipolar Disorder
  • Neurodegenerative diseases, presence of widespread brain lesions, language problems (other than naming difficulty)
  • Medical conditions that could potentially affect cognitive performance (e.g., human immunodeficiency virus (HIV) infection, cancer with metastatic potential).
  • Acute renal failure or end-stage renal disease

Study & Design

Study Type
OBSERVATIONAL
Study Design
Not specified
Arms & Interventions
Group: CAMERA
Intervention: CAMERA (Context-Aware Multimodal Ecological Research and Assessment)
Description: Adult subjects with epilepsy will undergo noninvasive video electroencephalography (EEG) and intracranial electrode sampling of the amygdala and hippocampus (unilateral or bilateral). A subset of subjects (n=10) will use the CAMERA platform for 2 weeks after discharge with a subset of modalities: physiologic wristband, smartphone phenotyping, ecological momentary assessment (EMA) surveys, and a memory task. At unpredictable intervals, CAMERA will interrupt subjects with (a) an audible alarm to elicit an acoustic startle response; (b) a self-reported anxiety state scale; and (c) a visuospatial memory task with threat interference. For example, participants will fill out a brief survey and play a video game several times each day, and will wear a wristband with sensors.
Primary Outcome Measures
Name: Percent of subjects demonstrating improvement in the EMANet prediction over time
Time frame: 1-30 days

Use EMANet to predict ≥1 ecological momentary assessment (EMA) anxiety-memory state outcome (target), with improvement over time measured by fitting a linear regression to the daily mean absolute error between predicted and actual EMA values. Prediction must use ≥2 different passive modalities and show significantly better prediction accuracy than either modality alone.
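A minimal sketch of this criterion, assuming the per-day mean absolute errors have already been computed; the numbers are synthetic, and scipy's ordinary least-squares routine stands in for whatever regression the study uses.

```python
import numpy as np
from scipy import stats

# Synthetic illustration (not study data): test whether daily MAE
# between predicted and actual EMA values declines over days.
rng = np.random.default_rng(0)
days = np.arange(1, 15)
daily_mae = 1.0 - 0.03 * days + rng.normal(0, 0.05, days.size)  # assumed MAE series

slope, intercept, r, p, se = stats.linregress(days, daily_mae)
improving = slope < 0 and p < 0.05  # negative slope = error shrinking over time
print(f"slope={slope:.3f}, p={p:.3f}, improving={improving}")
```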

Name: Mean absolute error between predicted and actual ecological momentary assessment (EMA) scores
Time frame: 1-30 days

Use a multimodal machine learning model (EMANet) to predict ≥1 EMA anxiety-memory state outcome (target) in held-out data at the population level. Mean absolute error is the mean of the absolute differences between predicted and actual EMA scores; a higher mean error represents a less accurate prediction. Prediction must use ≥2 different passive modalities and show significantly better prediction accuracy than either modality alone.
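For concreteness, a small sketch of the MAE metric and the multimodal-versus-unimodal comparison described above; every array below is hypothetical.

```python
import numpy as np

# Hypothetical held-out predictions; none of these values come from the study.
def mae(predicted, actual):
    """Mean of absolute differences between predicted and actual EMA scores."""
    return float(np.mean(np.abs(np.asarray(predicted) - np.asarray(actual))))

actual_ema      = np.array([3.0, 5.0, 2.0, 4.0])
multimodal_pred = np.array([3.2, 4.7, 2.4, 4.1])  # >=2 passive modalities combined
wristband_only  = np.array([3.9, 4.0, 3.1, 3.5])  # single-modality baseline
phone_only      = np.array([2.2, 5.6, 1.4, 4.9])  # single-modality baseline

# The outcome asks the multimodal MAE to beat each unimodal MAE.
print(mae(multimodal_pred, actual_ema))
print(mae(wristband_only, actual_ema), mae(phone_only, actual_ema))
```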

Secondary Outcome Measures
Name: Mean absolute error between predicted and actual absolute error on a daily basis
Time frame: 1-30 days

Use a multimodal machine learning model of prediction uncertainty (UncertaintyNet) to predict the mean absolute prediction error of ecological momentary assessment (EMA) predictions in held-out data, at the single-subject level on each day. Mean absolute error will measure the difference between the predicted error (based on all available data) and the actual error.
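A conceptual sketch of this error-prediction setup; plain linear models and random features stand in for UncertaintyNet and the passive data streams, since the listing does not specify the architecture.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Stand-in models and synthetic features; not the study's architecture.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))                 # hypothetical daily passive features
ema_true = X[:, 0] + rng.normal(0, 0.5, 200)  # hypothetical EMA scores

# Primary model predicts EMA; uncertainty model predicts its absolute error.
primary = LinearRegression().fit(X[:100], ema_true[:100])
train_abs_err = np.abs(primary.predict(X[:100]) - ema_true[:100])
uncertainty = LinearRegression().fit(X[:100], train_abs_err)

# Outcome metric: MAE between predicted and actual error on held-out days.
held_abs_err = np.abs(primary.predict(X[100:]) - ema_true[100:])
pred_abs_err = uncertainty.predict(X[100:])
print(np.mean(np.abs(pred_abs_err - held_abs_err)))
```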

Trial Locations

Locations (1)

Columbia University

New York, New York, United States
