MedPath

Brief Intervention at Adult Education

Not Applicable
Not yet recruiting
Conditions
Behavioral Health Challenges
Registration Number
NCT07102914
Lead Sponsor
Yale University
Brief Summary

Many Americans fail to receive their high school diploma. Individuals enrolled in Adult Education classes have exited the K-12 education system without a high school diploma. This reduces their access to economic resources, heightens their risk for poverty and poor health, limits their ability to meet the occupational and social expectations of adult life, and exacerbates their stress. Behavioral health challenges (i.e., depression, anxiety, anger, substance use) are implicated in K-12 school failure, as they negatively impact students' acquisition of academic skills and their achievement of educational and vocational goals. Students enrolled in Adult Education Centers (AECs) are often overlooked in analyses that explore how behavioral health issues impact students' general functioning and academic outcomes, even though behavioral health challenges in AECs may be greater than those in the general population. AECs are ill-equipped to address students' behavioral health challenges, and few evidence-based behavioral health interventions targeting the challenges AEC students may experience are currently deployed in these settings. Screening, Brief Intervention, and Referral to Treatment (SBIRT), often informed by Motivational Interviewing (MI), positively impacts health outcomes. Positive outcomes are associated with successful screening and referral to behavioral health services. In turn, these behavioral health improvements may also help facilitate positive academic results for impacted AEC students. Implementation Facilitation is a promising strategy for ensuring the successful implementation of SBIRT in AECs. Guided by the integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, the investigators propose an R34 project to: 1) conduct an iterative, mixed methods formative evaluation to identify barriers and facilitators of SBIRT implementation in AECs and tailor an Implementation Facilitation strategy to support the delivery of SBIRT to AEC students by Student Retention Specialists (SRSs); and 2) examine the acceptability, feasibility, and preliminary effectiveness of Implementation Facilitation to promote the use of SBIRT by SRSs with AEC students.

Detailed Description

Twenty-eight million (13%) adults aged 25 years and older in the United States, 368,000 (16%) in Connecticut, and 94,000 (17%) in New Haven County do not have a high school diploma. Behavioral health challenges (i.e., depression, anxiety, anger, substance use) are implicated in high rates of K-12 school failure. Failure to graduate from high school confers significant social, financial, and personal challenges that undermine one's ability to meet life expectations. Adult Education Centers (AECs) are designed to help individuals reenter the learning environment and achieve their educational-vocational goals (e.g., a high school diploma). AECs observe significant student "churning" (i.e., students entering and exiting without goal achievement). Student Retention Specialists (SRSs) in AECs are charged with engaging and retaining students. SRSs, however, do not receive training in evidence-based interventions to address students' behavioral health, including referring students to specialized treatment. Such training could help SRSs increase AEC students' successful completion of their educational-vocational goals.

Using motivational interviewing-informed Screening, Brief Intervention, and Referral to Treatment (SBIRT) with students within AECs holds promise for addressing the behavioral health challenges they experience. SBIRT can be delivered effectively by paraprofessionals with good clinical outcomes, including improvement in physical health, reductions in substance misuse, better behavioral and emotional functioning, increased self-efficacy, positive expectations for success, and increased academic motivation and continuation. How best to implement SBIRT in community settings like AECs is unknown and requires a formative evaluation process that identifies key determinants of SBIRT implementation in AECs and informs a modified Implementation Facilitation strategy that supports students' positive behavioral health and academic outcomes.

Guided by the integrated-Promoting Action on Research Implementation in Health Services (i-PARIHS) framework, the investigators propose an R34 project to: 1) conduct an iterative, mixed methods formative evaluation to identify barriers and facilitators of SBIRT implementation in AECs and tailor an Implementation Facilitation strategy to support the delivery of SBIRT to AEC students by SRSs; and 2) examine the acceptability, feasibility, and preliminary effectiveness of Implementation Facilitation to promote the use of SBIRT by SRSs with AEC students. Implementation Facilitation aligns with the factors considered in i-PARIHS, including attention to the characteristics of the providers, recipients, and context.

This project will include 1) a developmental formative evaluation before the single-arm open pilot study to identify barriers and facilitators to implementing SBIRT in an AEC, with attention to unique contributing factors; and 2) a progress-focused formative evaluation during the pilot study to identify factors that inform the modifications needed for Implementation Facilitation at the New Haven Adult Education Center (NHAEC) to enhance its capacity to achieve the designated study outcomes. This two-part formative evaluation will involve staff and students. Forty staff will be enrolled (20 for the developmental and 20 for the progress-focused formative evaluation) from full-time employees (n=32; 35+ hours/week), part-time employees (n=55; ≤ 19 hours/week), and administrative staff (n=3; 35+ hours/week). The investigators will attend to demographic (e.g., age, race, ethnicity, sex) and schedule (daytime, evening) distributions to obtain a representative sample. All eligible staff will be invited to complete the ORCA assessments (see below) as part of the developmental and progress-focused formative evaluations. NHAEC staff will be required to have been employed by NHAEC for at least one year to ensure sufficient experience with the setting.

Forty students (20 each for the developmental and progress-focused formative evaluations) will be enrolled. Inclusion criteria: ≥ 17 years old and currently enrolled in either the High School Credit or General Educational Development programs. Exclusion criteria: acute psychosis, acute intoxication, or low intellectual functioning. For these students, study enrollment will be deferred until they are no longer impaired and are able to understand what is expected. The investigators will strive to recruit a representative sample of students across demographics (e.g., age, race, ethnicity, sex) and program type (High School Credit, General Educational Development; daytime, nighttime) using the previous year's enrollment proportions. The i-PARIHS framework will guide the development of interview guides for the staff and student focus groups and the administrator key informant interviews. After consenting to participate in the focus groups, participants will complete demographic questionnaires. They will then be introduced to the facilitators and provided a high-level summary of the background for the focus group, including its goal of understanding their perspectives. Specifically, participants will be asked about their perceptions of the intervention (e.g., degree of fit with existing practices and values, usability to achieve desired goals with noticeable results), recipient characteristics (e.g., motivation, values, beliefs and goals, skills and knowledge, available resources, opinion leader endorsement), inner contexts (e.g., leadership support, institutional culture supporting practice change, learning environment supporting SBIRT adoption), outer contexts (e.g., NHAEC policies and priorities, inter-organizational networks and relationships), and facilitation efforts (e.g., roles and activities at NHAEC that might support SBIRT implementation). They will be informed about the expectations around confidentiality and not sharing with others what is discussed in the focus group. During the interviews, students and staff will be presented with information (e.g., behavioral health concerns observed among NHAEC students, the disparity between the number of students registered and those who meet minimum instructional requirements) and asked to react to the information presented. Interviews with staff and students will also explore their unique perceptions of: 1) the behavioral health needs of students attending NHAEC and how these unmet needs impact students' educational achievements; 2) NHAEC students' prior experiences receiving support; 3) how to meet these needs, with consideration of SBIRT as part of the process; 4) barriers and facilitators to implementing SBIRT in AECs; and 5) factors that may contribute to challenges experienced regarding behavioral health, access to care, and educational goal attainment. The Study PI and Co-I, Dr. Kaufman, will develop the interview protocol, attending to the overlapping and unique foci for each of the groups (staff, students). Key informant interviews will be conducted by trained researchers over an encrypted Zoom call. The researchers were trained in qualitative interview techniques by Co-I Kaufman. In addition, each researcher will conduct at least one practice interview with the Co-I, who will provide feedback. All interviews will be audio-recorded and transcribed using HIPAA-compliant Zoom, with all identifiable data removed. Qualitative data analyses will be conducted by the PI and Co-I, Dr. Kaufman,
using an inductive analytic approach guided by the i-PARIHS framework. The investigators will independently review the transcripts and generate initial open codes. The investigators will then meet to review the codes that emerged and develop a draft codebook that will include operational definitions and instructions for employing each code. Next, each researcher will independently code two transcripts and meet to discuss coding decisions, reach consensus on codes, identify new codes that emerged from the data, and finalize the codebook. Two researchers will independently code each transcript and then meet to reach consensus. Co-I Kaufman will review all transcripts to assess agreement with the codes and confer with the coders as needed. The codes and codebook will be shared in a co-design meeting with select staff, SRSs, and administrators to facilitate consensus on the salient components emerging from the data. NVivo will allow for assessment of intercoder reliability, increase the credibility of the coding process, and allow the team to identify any coder drift. In addition, themes will be identified and reviewed, and this process will continue until thematic saturation of content is achieved. The analysis procedures described will be used for both the developmental and progress-focused formative evaluations. Associated demographic and quantitative data from the focus group participants will be stored in REDCap files. The collection of data from multiple informants, the iterative processing and analysis of the data collected, and the use of two researchers to code transcripts increase the credibility, transferability, dependability, and confirmability of the findings. NHAEC staff participating in the focus group interviews will complete a questionnaire characterizing their demographics and AEC work experience. All staff will be invited to complete the Organizational Readiness for Change Assessment (ORCA) measure focused on factors potentially affecting SBIRT implementation, including the respondents' perceptions of 1) the evidence for SBIRT, 2) the quality of the implementation context for SBIRT, and 3) the degree to which the facilitation activities support SBIRT implementation. These scales have good internal consistency and factor structure and are based on the i-PARIHS framework. The developmental formative evaluation will include only the evidence and context sections of the ORCA, as Implementation Facilitation will not be experienced by the staff until the single-arm open pilot study has commenced. The progress-focused formative evaluation will include the evidence, context, and facilitation sections of the ORCA. The formative evaluation results will be critical for distilling the Implementation Facilitation activities that work best for establishing SBIRT as a practice in AECs and evolving them with the experience gained during the pilot study. Quantitative data analyses will be conducted by the PI and Co-I Kershaw and will include summary descriptive statistics for the demographic measures. For the ORCA assessments, the investigators will also compute descriptive statistics for the ORCA items and scale scores for the evidence, context, and facilitation scales (the facilitation scale assessed during the progress-focused formative evaluation during the pilot feasibility study).
Assuming participation of most NHAEC staff, all available data and generalized estimating equations (GEE), an extension of the generalized linear model (GLM) that allows quantitative data to be clustered by NHAEC program (High School Credit or General Educational Development) and respondent, will be used to document change in ORCA scores from the developmental to the progress-focused formative evaluation. GEE can be used in the analysis of continuous data even when they are not normally distributed. The investigators will triangulate the qualitative interviews and quantitative ORCA assessment data for both the developmental and progress-focused formative evaluations. First, the investigators will use a mixed methods approach to compare responses on the ORCA survey with cross-cutting themes represented in the interview transcripts. Second, the investigators will "follow threads," namely, examine themes that cut across the two formative evaluation periods to understand factors that need further consideration and modification to best facilitate the implementation of SBIRT at NHAEC.
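As an illustration only (not the study's analysis code), a minimal sketch of the GEE approach described above using Python's statsmodels is shown below, assuming a long-format dataset with hypothetical column names (orca_score, phase, program, staff_id) and treating the respondent as the clustering unit with program entered as a covariate:

```python
# Minimal GEE sketch for the change in ORCA scores described above.
# Column names and the file name are hypothetical assumptions:
#   orca_score : ORCA scale score (continuous)
#   phase      : 0 = developmental, 1 = progress-focused evaluation
#   program    : "High School Credit" or "General Educational Development"
#   staff_id   : respondent identifier used as the clustering unit
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("orca_long.csv")  # hypothetical long-format file

# Exchangeable working correlation accounts for repeated measures on the
# same respondent; a Gaussian family is used for the continuous scores.
model = smf.gee(
    "orca_score ~ phase + C(program)",
    groups="staff_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())  # the 'phase' coefficient estimates the change
```

The exchangeable working correlation and the treatment of program as a covariate are simplifying assumptions for the sketch; the study team's actual model specification may differ.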

The proposed research focuses on developing an Implementation Facilitation approach appropriate for AEC settings. A single-arm, open pilot design maximizes the ability of the investigators to work collaboratively with NHAEC staff and administration to tailor the implementation strategy; establish the feasibility and acceptability of study methods and interventions (e.g., recruiting students, training SRSs to deliver SBIRT with integrity, response rates to assessment measures, student receipt of SBIRT, retention of SRSs and students); address the structural barriers within the AEC setting; identify the needs of the populations of interest; and gather data that points to the best approaches for ensuring the study's success. The choice to use a single site precludes testing the site-level randomization that would be part of a proposed larger multisite hybrid effectiveness-implementation trial. Feasibility pilot studies do not necessarily have to resemble the final larger study design and instead are more flexible, allowing major study components to be fine-tuned before additional resources are expended on more rigorous, larger-scale, fully powered hybrid effectiveness-implementation trials. Because this is a pilot study, if there are unanticipated outcomes, the investigators will adapt in response. The investigators' interest is in identifying and documenting preliminary efficacy; the observations made in this study are not intended to meet conventional thresholds for statistical confirmation. The investigators hope that this approach will help facilitate the proposed intervention's acceptance and use across AECs and help advance the field's understanding of how best to meet the needs of their underserved populations. All SRSs have agreed to participate in the pilot study (see letters of support). They will be responsible for providing the students with feedback about their PROMIS assessment scores, a brief MI-informed intervention, and referral to treatment, if needed. SRS eligibility criteria include employment at NHAEC and consent to participate in the study. There are no SRS exclusion criteria, as they work full-time and are in similar positions involving frequent direct interactions with students. The investigators will also invite students to enroll in the study and assess their behavioral health needs using the PROMIS assessment short forms. Inclusion criteria: ≥ 17 years old and currently enrolled in either the High School Credit or General Educational Development programs. Exclusion criteria: psychosis, low intellectual functioning, and/or acute intoxication. For these students, study enrollment will be deferred until they are no longer impaired and are able to understand what is expected. The investigators will strive to recruit a representative sample of students across demographics (e.g., age, race, ethnicity, sex) and program type (High School Credit, General Educational Development; daytime, nighttime) using the previous year's enrollment proportions. Over three (3) academic semesters, the investigators plan to consent, enroll, and assess students. Student participant enrollment will commence two weeks prior to the beginning of the academic semester and extend two weeks into the semester, during NHAEC's designated student enrollment period for the High School Credit and General Educational Development programs. The enrollment periods will occur in each of the three semesters falling within the project period.
The investigators will continue to enroll students until they have data on students meeting sub-clinical and clinical thresholds (n=200) for problematic behavioral health (~70%) and substance use (~30%). Students willing to participate in the study will be taken to the computer lab at NHAEC, where the assessment protocol will be loaded on a computer for completion. As part of the enrollment protocol, the investigators will attend to differences in key enrollment markers (age, race, ethnicity, sex, SES) across the two programs to ensure comparable representation. Quantitative data quality and analyses will be overseen by Dr. Gordon and Co-I Kershaw. Descriptive statistics with 95% confidence intervals for the Reach, Adoption, Implementation, and Effectiveness outcomes will be computed. Where indicated, one-sample z tests will be used to test observed proportions against the proposed benchmarks (e.g., >50%) for the Reach, Adoption, and Implementation outcomes. Moreover, assuming a substantial sample of student participants, the investigators will again use all available data and generalized estimating equations (GEE), with allowance for clustering of data within NHAEC program and student, to document change in the Effectiveness outcomes measured repeatedly from baseline to the 6-month follow-up. Time will be represented as days from the baseline assessment to each follow-up assessment. The investigators will also consider change within the full intention-to-treat sample of students screened and within key subgroups, such as High School Credit versus General Educational Development students, Latine versus Black students, male versus female students, and students with mild versus moderate to severe symptoms. Effectiveness outcomes not measured repeatedly, such as continued enrollment, hours of instruction received, and credits earned, will be tested against the proposed benchmarks using simple t or z tests, first for the full intention-to-treat sample and then to test whether each benchmark was met for the key subgroups mentioned above. Consistent with recommendations for pilot studies that guide the implementation of behavioral interventions, the primary considerations in the quantitative analyses will be 1) the feasibility of collecting these student outcomes, 2) the appropriateness of the statistical approach, and 3) estimation of the direction, magnitude, and consistency of potential effectiveness. The investigators will, however, also accept statistics with p values less than .05 as statistically significant for tests involving the full intention-to-treat sample.
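As a rough, hypothetical illustration of the benchmark tests described above (the counts below are invented, not study data), a one-sample z test of an observed Reach proportion against the >50% benchmark could look like the following sketch in Python with statsmodels:

```python
# Hedged sketch: one-sample z test of an observed proportion against a
# pre-specified benchmark (e.g., Reach > 50%). Counts are hypothetical.
from statsmodels.stats.proportion import proportion_confint, proportions_ztest

screened = 130    # hypothetical number of students screened
eligible = 200    # hypothetical number of eligible students
benchmark = 0.50  # proposed benchmark for Reach

# 95% confidence interval for the observed proportion
ci_low, ci_high = proportion_confint(screened, eligible, alpha=0.05)

# z test of H0: p = 0.50 versus H1: p > 0.50
z_stat, p_value = proportions_ztest(
    count=screened, nobs=eligible, value=benchmark, alternative="larger"
)
print(f"reach = {screened / eligible:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```

The same pattern would apply to the Adoption and Implementation benchmarks, substituting the relevant counts and benchmark values.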

The focal AEC for this project serves greater New Haven, a community with lower rates of high school graduation and limited resources offered to meet students' behavioral health needs. Study results will help identify viable approaches for implementing empirically validated interventions in AECs that may ultimately support the academic success of students.

SPECIFIC AIMS ARE TO:

1. Develop an Implementation Facilitation strategy for SBIRT in an AEC using a mixed-methods, two-part formative evaluation process guided by the i-PARIHS framework.

1a. Collaborate with partners to identify barriers and facilitators to implementing SBIRT in an AEC, with attention to contributing factors.

1b. Develop a tailored Implementation Facilitation strategy to promote SBIRT adoption in an AEC.

2. Examine, in a single-arm, open pilot study, the acceptability, feasibility, and preliminary effectiveness of Implementation Facilitation supporting the use of SBIRT by SRSs to address the behavioral health needs of adult students participating in an AEC.

2a. Examine the acceptability (e.g., satisfaction, usability) of Implementation Facilitation to support the adoption of SBIRT in an AEC among students, SRSs, other staff, and leadership.

2b. Examine the feasibility of using Implementation Facilitation to support SRSs' delivery of SBIRT in an AEC by documenting i) the degree to which SRSs reach students with SBIRT (% screened, % receiving a brief intervention); ii) adopt SBIRT into practice (number of SRSs who use SBIRT with at least 50% of students); and iii) implement SBIRT well (% of SRSs who demonstrate basic proficiency in SBIRT).

2c. Examine the preliminary effectiveness of SBIRT in an AEC by documenting i) how many eligible students agree to enroll in the study (target > 50%); ii) treatment engagement (completed brief intervention, attended at least one session with a community provider); iii) clinically significant changes in behavioral health outcomes; and iv) AEC program attendance (i.e., receipt of at least 12 hours of academic instruction).

Should the use of Implementation Facilitation as a strategy to implement SBIRT by SRSs in an AEC demonstrate initial acceptability, feasibility, and preliminary effectiveness, our team will further test this approach in other AECs using a hybrid type 2 effectiveness-implementation, multi-site trial design within an R01 application.

Recruitment & Eligibility

Status
NOT_YET_RECRUITING
Sex
All
Target Recruitment
340
Inclusion Criteria
  • ≥ 17 years old and currently enrolled in either the High School Credit or General Educational Development programs.
  • All eligible staff will be invited to participate in this study
Exclusion Criteria
  • Students with acute psychosis, acute intoxication, or low intellectual functioning.
  • Not a staff member at New Haven Adult Education

Study & Design

Study Type
INTERVENTIONAL
Study Design
SINGLE_GROUP
Primary Outcome Measures
Name | Time | Method
Patient-Reported Outcomes Measurement Information System (PROMIS) - Depression | pre-intervention (day 1); 3 months; 6 months

Behavioral measure of depression symptoms. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).

Patient-Reported Outcomes Measurement Information System (PROMIS) - Anxiety | pre-intervention (day 1); 3 months; 6 months

Behavioral health measure. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).

Patient-Reported Outcomes Measurement Information System (PROMIS) - Anger | pre-intervention (day 1); 3 months; 6 months

Behavioral health screening and tracking measure. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).

Patient-Reported Outcomes Measurement Information System (PROMIS) - Smoking | pre-intervention (day 1); 3 months; 6 months

Behavioral health measure. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).

Patient-Reported Outcomes Measurement Information System (PROMIS) - Substance Use | pre-intervention (day 1); 3 months; 6 months

Behavioral health measure. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).

Patient-Reported Outcomes Measurement Information System (PROMIS) - Alcohol Use | pre-intervention (day 1); 3 months; 6 months

Behavioral health measure. Clinical thresholds: mild symptoms (T-score 55-60), moderate (T-score 60-65), and severe (T-score ≥ 66).
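For clarity, a minimal sketch of how a PROMIS T-score might be mapped to the severity categories listed above; the cut points are taken from the thresholds stated for these measures, while the exact boundary handling and the label for scores below 55 are assumptions:

```python
# Minimal sketch mapping a PROMIS T-score to the severity categories used
# above; boundary handling at the cut points is an assumption.
def promis_severity(t_score: float) -> str:
    """Classify a PROMIS T-score using the thresholds stated for this study."""
    if t_score >= 66:
        return "severe"
    if t_score >= 60:
        return "moderate"
    if t_score >= 55:
        return "mild"
    return "below mild threshold"

# Example usage with hypothetical scores
for t in (52, 57, 63, 70):
    print(t, promis_severity(t))
```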

Secondary Outcome Measures
Name | Time | Method
Client Satisfaction Questionnaire - 8 (CSQ-8) | pre-intervention (day 1)

Satisfaction measure. Total scores range from 8 to 32, with higher scores indicating greater client satisfaction.

Academic attendance | 3 months and 6 months post-intervention

Measure of academic engagement: hours of academic instruction received. Twelve hours is the minimum for designation of academic engagement.

Intervention Usability Scale | 3 months and 6 months post-intervention

Measures the usability and learnability of health interventions. Each item is rated on a scale of 0 to 4, with half the items reverse-scored. The final score ranges from 0 to 100, where higher scores indicate better usability.
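As an illustration of the scoring arithmetic described above, a minimal sketch follows; the number of items and which items are reverse-scored are not specified here, so the sketch assumes alternating items are reverse-scored and scales the raw sum to 0-100:

```python
# Hedged sketch of the 0-100 scaling described above. Which items are
# reverse-scored is an assumption (alternating items used here).
from typing import Sequence

def usability_score(items: Sequence[int]) -> float:
    """Scale summed 0-4 item ratings (half reverse-scored) to a 0-100 score."""
    if not items or any(not 0 <= x <= 4 for x in items):
        raise ValueError("expected item ratings in the range 0-4")
    adjusted = [
        x if i % 2 == 0 else 4 - x  # reverse-score alternate items (assumption)
        for i, x in enumerate(items)
    ]
    return sum(adjusted) / (4 * len(items)) * 100  # raw sum scaled to 0-100

# Example usage with a hypothetical 10-item response pattern
print(usability_score([4, 1, 3, 0, 4, 1, 3, 2, 4, 0]))
```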
