Developing and Testing a Social Network Data Capture Tool to Improve Partner Services: a Preliminary Pilot Implementation
- Conditions
- Implementation Science, Contact Tracing, Public Health System Research
- Registration Number
- NCT06659003
- Lead Sponsor
- Northwestern University
- Brief Summary
This study conducted a preliminary pilot implementation which integrated our existing social network software - Network Canvas - into Chicago Partner Services in order to understand the feasibility and acceptability of this integration, and to gather preliminary evidence of potential efficacy in improving Partner Services metrics.
All of this work was conducted through an Active Implementation Framework in which we utilized a staged approach and strong engagement with local (e.g., Chicago Department of Public Health and Howard Brown Health) and national stakeholders to explore, install, and implement a software pilot within Partner Services.
- Detailed Description
This study conducted a preliminary pilot implementation which integrated our existing social network software - Network Canvas - into Chicago Partner Services in order to understand the feasibility and acceptability of this integration, and to gather preliminary evidence of potential efficacy in improving Partner Services metrics. Specifically, the satisfaction of clients, of Disease Intervention Specialists (DIS), and of public health officials will be assessed. Furthermore, we will track implementation quality and sustainability. Finally, preliminary evidence of efficacy will be captured through a pre-/post-test historical control design evaluating data quality, the proportion of partners notified, and the timeliness of interviewing, testing, and linkage to care.
PRIMARY OUTCOMES
Feasibility: We will assess the feasibility of our initial implementation of Network Canvas. We define feasibility here as satisfaction with and usability of all major areas of our initial reconfiguration of Network Canvas for Howard Brown. For example, we will measure the satisfaction with and usability of our touchscreen interfaces, our procedures for deployment and data management, and important logistical supports such as our training materials, the support provided by our team, and our procedures for evaluation. Data will come from key informant interviews (KIIs) in which stakeholders at multiple levels within Howard Brown and CDPH respond to open-ended questions about implementation, usability, and ease of use, as well as complete a modified form of the User Satisfaction and Usability Measure to assess their perspective on the feasibility of Network Canvas. We will further assess feasibility by monitoring program data on uptake.
Acceptability: Our first measure of acceptability will be Fidelity. We will assess the fidelity, or quality of implementation, of our initial implementation of Network Canvas. We define fidelity here as adherence to interview protocols, deployment protocols, and data workflow protocols. Data will come from a mixture of staff self-ratings of adherence during KIIs and periodic supervision meetings, and researcher ratings of adherence by our staff. The assessment of implementation quality is not just an important outcome in year 3 for us to demonstrate acceptability; it will also serve as a feedback loop to ensure high-quality implementation. Further, by beginning assessment in year 2, our team will have the opportunity to adjust and improve our strategy for implementation as well. For example, we may find it necessary to refine our training materials, the design of our software, or our strategy for coaching and supervision.
Our second measure of acceptability will be Sustainability - or the extent to which our program is able to persist and what ongoing supports would be necessary from our team. Like fidelity, early conversations about sustainability will serve as a feedback loop that will allow our team to first identify the important drivers of implementation in Chicago and other health departments, and then shape our initial design to be sustainable. Conversations about sustainability will occur as early as year 1. Sustainability will primarily be assessed via KIIs with individuals in leadership positions. For those outside of Chicago, we will share Chicago implementation protocols to understand how sustainability may differ across the country.
SECONDARY OUTCOME
Preliminary Evidence of Efficacy: To provide preliminary evidence of the efficacy of Network Canvas on Partner Services Program outcomes, we will work closely with Howard Brown and CDPH to analyze anonymized administrative data. While we anticipate that our requested metrics will be largely standard for CDPH's analysts, as the majority are tracked for yearly reporting to the Centers for Disease Control and Prevention (CDC), our team may need to run analyses ourselves. If that occurs, because the data received by our team will be completely stripped of identifiers, this work will not be considered human subjects research. All analyses of efficacy will utilize a pre-/post-test measurement - specifically a historical control design - in which we will compare program data obtained in the years prior (excluding 2020 due to COVID-19) to program data obtained during the one-year period of full implementation. We have chosen a number of metrics based primarily on the CDC's 18-1802 Evaluation and Performance Measurement Plan for Partner Services, as well as two other studies that contributed to defining key process and outcome measures of Partner Services (Bernstein 2014 & Marcus 2009). These HIV and program metrics span data quality, the proportion of partners notified, and timeliness.
Recruitment & Eligibility
- Status
- COMPLETED
- Sex
- All
- Target Recruitment
- 23
- (i) age 18+ years old and (ii) key informant connected to the administration or implementation of Partner Services, including but not limited to a Partner Services Staff Member (DIS or a DIS Manager), a Surveillance Staff Member, a federally funded STD/HIV Program Director, an Information Technology Staff Member responsible for the technical systems which support or are connected to Partner Services, and individuals who have in the past been enrolled and interviewed within Partner Services.
- None
Study & Design
- Study Type
- OBSERVATIONAL
- Study Design
- Not specified
- Primary Outcome Measures
Feasibility (12 months): We will assess the feasibility of our initial implementation of Network Canvas. We define feasibility here as satisfaction with and usability of all major areas of our initial reconfiguration of Network Canvas for CDPH. For example, we will measure the satisfaction with and usability of our touchscreen interfaces, our procedures for deployment and data management, and important logistical supports such as our training materials, the support provided by our team, and our procedures for evaluation. Data will come from KIIs in which stakeholders at multiple levels will respond to open-ended questions about implementation, usability, and ease of use, as well as complete a modified form of the User Satisfaction and Usability Measure to assess their perspective on the feasibility of Network Canvas.
Acceptability (12 months): Our first measure of acceptability will be Fidelity. We will assess the fidelity, or quality of implementation, of our initial implementation of Network Canvas. We define fidelity here as adherence to interview protocols, deployment protocols, and data workflow protocols. Data will come from a mixture of staff self-ratings of adherence during KIIs and periodic supervision meetings, and researcher ratings of adherence by our staff. The assessment of implementation quality serves as a feedback loop to ensure high-quality implementation, giving our team the opportunity to adjust and improve our strategy for implementation. Our second measure of acceptability will be Sustainability - the extent to which our program is able to persist and what ongoing supports would be necessary from our team.
- Secondary Outcome Measures
Preliminary Evidence of Efficacy (12 months): To provide preliminary evidence of the efficacy of Network Canvas on Partner Services Program outcomes, we worked closely with Howard Brown and CDPH to obtain anonymized administrative data. These metrics are largely standard for CDPH's analysts, as the majority are tracked for yearly reporting to the Centers for Disease Control and Prevention (CDC). All obtained data will be completely stripped of identifiers; therefore, this work will be considered exempt. All analyses of efficacy will utilize a pre-/post-test measurement - specifically a historical control design - in which we will compare program data from the prior years (2019-2022, excluding 2020 due to COVID-19) to program data obtained during the one-year period of implementation. Specific program metrics include data quality, the number of partners notified, and timeliness.
Trial Locations
- Locations (1)
Howard Brown Health
Chicago, Illinois, United States