Posture Analysis Through Machine Learning
- Conditions
- Health Behavior
- Registration Number
- NCT05034302
- Lead Sponsor
- SentiMetrix, Inc
- Brief Summary
This study will include video-recorded data from 20 adults (ages 18-85 years) residing in San Luis Obispo, CA. Participants will have their height and weight measured, complete demographic questionnaires, and take part in one 3-hour video-recorded session conducted under a combination of naturalistic and semi-structured conditions. The video data will be used to train machine learning models to automatically classify physical behavior, evaluated against ground-truth manual annotations.
- Detailed Description
This is a cross-sectional, single-observation study. Individuals will be drawn from local surrounding clinics and the general community. All recruitment will include both men and women. Selection criteria are age 18-85 years, no major chronic illness that impairs mobility, and the ability to complete activities of daily living without assistance. Participants will complete one three-hour session during which one video camera will be set up within the home (i.e., a static camera). For approximately 30 minutes of the session they will complete a semi-scripted routine that includes sit-to-stand transitions, a timed up-and-go test, and scripted activities of daily living.
Researchers will use a video camera to record participant behavior within their daily life. For two of the three hours, researchers will video record the participants' normal (unscripted) activities. For one hour of the session, researchers will use two cameras: one held by a researcher and one set up on a tripod. During this hour, participants will be asked to follow a semi-structured protocol:
* 10 minutes of recording the empty space
* 10 minutes that include a timed up-and-go test (stand up from a chair and walk 10 feet), repeated 3 times
* A 6-minute walk test (walk continuously for 6 minutes)
* A four-stage balance test
* For the remainder of the time, participants will complete standard activities of daily living such as household chores, eating, or drinking
Data will be annotated by trained research assistants using established behavioral observation software (ground truth). The image data from the videos will be used to train machine learning models to classify physical activities (e.g., 'walking', 'sitting', or 'standing up'), information about behavior (e.g., location and purpose of the activity), and performance (e.g., walking speed and sit-to-stand transition times).
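The registry entry does not specify the modeling approach. Purely as an illustration of the general idea of training a classifier against manual annotations, the sketch below fits a conventional classifier on per-frame pose features and compares its predictions to held-out ground-truth labels. The feature layout, label set, and choice of scikit-learn are assumptions for the example, not the study's actual pipeline.

```python
# Minimal sketch (hypothetical, not the study's method): frame-level activity
# classification from precomputed pose features, evaluated against manually
# annotated labels. Extracting pose features from raw video is assumed to
# have already been done by some upstream step.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

ACTIVITIES = ["sitting", "standing_up", "walking"]  # assumed label set

# Stand-in data: rows = video frames, columns = pose features
# (e.g., 17 keypoints x (x, y) coordinates); labels = ground-truth
# annotations produced by trained research assistants.
rng = np.random.default_rng(0)
X = rng.normal(size=(1500, 34))           # placeholder features
y = rng.choice(ACTIVITIES, size=1500)     # placeholder annotations

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Agreement between model predictions and the held-out manual annotations.
print(classification_report(y_test, clf.predict(X_test)))
```

With real data, the placeholder arrays would be replaced by features extracted from the video frames and the corresponding annotation labels; the evaluation step stays the same.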
Recruitment & Eligibility
- Status
- UNKNOWN
- Sex
- All
- Target Recruitment
- 20
- Age 18-85 years
- No major chronic illness that impairs mobility
- Able to complete activities of daily living without assistance.
Not provided
Study & Design
- Study Type
- OBSERVATIONAL
- Study Design
- Not specified
- Primary Outcome Measures
| Name | Time | Method |
| --- | --- | --- |
| Sit-to-stand transition time | Upon enrollment (one timepoint) | Time it takes to go from sitting to standing |
| Postural status | Upon enrollment (one timepoint) | Sitting versus standing versus moving |
| Activity type | Upon enrollment (one timepoint) | Indoor vs. outdoor vs. driving |
- Secondary Outcome Measures
| Name | Time | Method |
| --- | --- | --- |
| Activity type | Upon enrollment (one timepoint) | Lying, sitting, driving, standing, housework or office work, walking, running, sports, other |
| Activity intensity | Upon enrollment (one timepoint) | Sedentary, light, moderate, and vigorous intensity |
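The primary outcome, sit-to-stand transition time, could in principle be derived from frame-level posture labels once a model or annotator has produced them. The sketch below is a hypothetical illustration under that assumption; the three-state labeling scheme, label names, and frame rate are invented for the example and are not described in the registry entry.

```python
# Minimal sketch (assumptions, not the study's method): deriving a
# sit-to-stand transition time from per-frame posture labels. Each frame is
# labeled "sitting", "transition", or "standing"; the transition time is the
# duration of the contiguous "transition" run between the two states.
from typing import List

def sit_to_stand_times(labels: List[str], fps: float = 30.0) -> List[float]:
    """Return durations (seconds) of each sitting -> standing transition."""
    times = []
    i = 0
    while i < len(labels):
        # A transition run that immediately follows a sitting frame.
        if labels[i] == "transition" and i > 0 and labels[i - 1] == "sitting":
            start = i
            while i < len(labels) and labels[i] == "transition":
                i += 1
            # Only count it if the participant ends up standing.
            if i < len(labels) and labels[i] == "standing":
                times.append((i - start) / fps)
        else:
            i += 1
    return times

# Example: 30 sitting frames, 45 transition frames, 30 standing frames at 30 fps.
labels = ["sitting"] * 30 + ["transition"] * 45 + ["standing"] * 30
print(sit_to_stand_times(labels))  # [1.5]
```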
Trial Locations
- Locations (1)
CalPoly
🇺🇸 San Luis Obispo, California, United States