
Implementation evaluation of multiple complex early years interventions: an evaluation framework and study protocol
Nimarta Dharni1, Josie Dickerson1, Kathryn Willan1, Sara Ahern1, Abigail Dunn2, Dea Nielsen2, Eleonora Uphoff2, Rosemary R C McEachan1, Maria Bryant3

1 Born in Bradford, Bradford Institute for Health Research, Bradford, UK
2 University of York, York, UK
3 University of Leeds, Leeds, UK

Correspondence to Dr Nimarta Dharni; nimarta.dharni@bthft.nhs.uk

Abstract

Introduction Implementation evaluations are integral to understanding whether, how and why interventions work. However, unpicking the mechanisms of complex interventions is often challenging in usual service settings where multiple services are delivered concurrently. Furthermore, many locally developed and/or adapted interventions have not undergone any evaluation, thus limiting the evidence base available. Born in Bradford’s Better Start cohort is evaluating the impact of multiple early life interventions being delivered as part of the Big Lottery Fund’s ‘A Better Start’ programme to improve the health and well-being of children living in one of the most socially and ethnically diverse areas of the UK. In this paper, we outline our evaluation framework and protocol for embedding pragmatic implementation evaluation across multiple early years interventions and services.

Methods and analysis The evaluation framework is based on a modified version of the Conceptual Framework for Implementation Fidelity. Using qualitative and quantitative methods, our evaluation framework incorporates semistructured interviews, focus groups, routinely collected data and questionnaires. We will explore factors related to the content, delivery and reach of interventions at both individual and wider community levels. Potential moderating factors impacting intervention success, such as participants’ satisfaction, strategies to facilitate implementation, quality of delivery and context, will also be examined. Interview and focus group guides will be based on the Theoretical Domains Framework to further explore the barriers and facilitators of implementation. Descriptive statistics will be employed to analyse the routinely collected quantitative data, and thematic analysis will be used to analyse the qualitative data.

Ethics and dissemination The Health Research Authority (HRA) has confirmed our implementation evaluations do not require review by an NHS Research Ethics Committee (HRA decision 60/88/81). Findings will be shared widely to aid commissioning decisions and will also be disseminated through peer-reviewed journals, summary reports, conferences and community newsletters.

  • implementation science
  • process evaluation
  • early years interventions
  • prevention
  • infancy
  • pregnancy
  • child health
  • maternal health
  • inequalities

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


What is already known on this topic?

Early years interventions are integral to improving the life chances for children and reducing inequalities in health and well-being. However, there is a dearth of evidence examining the impact of interventions, especially those that have been developed and/or adapted for local contexts.

What this study hopes to add

Our focus on implementation presents a pragmatic and consistent approach to evaluating multiple early years interventions, including those deemed as not yet ready for evaluations of effectiveness. The mixed-methods approach and use of routinely collected data provide an efficient, feasible and manageable evaluation framework that can be easily embedded within services as they are being delivered.

Introduction

The early years of life are integral to promoting positive outcomes throughout the lifespan.1 Women’s health in pregnancy and the first two years of their children’s lives have been identified as critical periods in children’s emotional, cognitive and physical development.2–4 Early years interventions are therefore crucial to reduce inequalities and ensure the health and well-being of children as they grow. However, many preventative interventions delivered by early years services have not been subjected to rigorous development and evaluation, thus leaving them without a robust evidence base.5–8

Better Start Bradford is a Big Lottery–funded programme that has commissioned and implemented over 20 early years interventions into existing practice in three deprived and ethnically diverse inner-city wards of Bradford. The interventions aim to improve social and emotional development, communication and language development, and nutrition and health in children aged 0–4 years.8 9 The limited availability of evidence of effect for many early years interventions means that the majority of Better Start Bradford interventions are considered ‘science based’, with some in the foundational stages of development and/or evaluation5 10 (see table 1 for further details about the interventions and Better Start Bradford9).

Table 1

Interventions commissioned for delivery as part of the Better Start Bradford programme

Born in Bradford’s Better Start (BiBBS) experimental birth cohort was established to provide independent effectiveness evaluations of these early years interventions through planned controlled experiments and quasi-experimental methods. However, effective interventions are not only those that demonstrate positive effects on key outcomes, but also those that can recruit and engage participants and be delivered with fidelity in usual service settings. It is therefore critical to conduct implementation evaluations to provide evidence of the feasibility, reach, context and short-term impact of interventions.6 11 Furthermore, implementation or process evaluations can help elucidate the transferability of interventions, providing local commissioners and service providers with guidance on the practical measures they can take to successfully embed interventions within their settings and communities.6

This paper describes a framework and protocol to evaluate the implementation of interventions being delivered as part of the Better Start Bradford programme. Our evaluation framework can be used by researchers, practitioners, commissioners and service providers across multiple settings to evaluate the quality of implementation of early years interventions being delivered in usual settings and maximise potential for more intensive levels of evaluation.

Methods

Conceptual framework

Underpinned by the Medical Research Council guidance on process evaluations of complex interventions, our implementation evaluations draw on the Conceptual Framework for Implementation Fidelity (figure 1).12 13

Figure 1

Modified Conceptual Framework for Implementation Fidelity (Carroll et al, 2007; Hasson et al, 2010).

Fidelity, also termed adherence, is defined as a combination of the content, frequency and duration of delivery, and coverage.12 13 Examining fidelity therefore seeks to establish the extent to which the active ingredients of the intervention were delivered as often and for as long as planned.12 13 The framework also includes potential moderators of the implementation process and fidelity, such as intervention complexity, participant responsiveness (including engagement and satisfaction), quality of delivery and strategies that facilitate implementation. Context and recruitment were later added as potential moderators in the modified framework.12 The moderators are proposed to be intrinsically linked to each other as well as to implementation fidelity.
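To make these elements concrete, the sketch below shows one way the adherence dimensions and moderators of the modified framework could be captured as a simple data structure. It is illustrative only, written in Python, and all field names are hypothetical rather than drawn from the protocol itself.

    # Illustrative sketch only: one possible record structure for the elements of
    # the modified Conceptual Framework for Implementation Fidelity. All field
    # names are hypothetical and not taken from the Better Start Bradford protocol.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class AdherenceRecord:
        intervention: str            # eg, a parenting programme
        content_delivered: bool      # were the active ingredients delivered?
        sessions_delivered: int      # frequency of delivery to date
        total_duration_minutes: int  # duration of delivery
        participants_reached: int    # coverage

    @dataclass
    class Moderators:
        complexity: str                         # intervention complexity
        participant_responsiveness: str         # engagement and satisfaction
        quality_of_delivery: str
        facilitation_strategies: List[str] = field(default_factory=list)
        context: str = ""                       # added in the modified framework
        recruitment: str = ""                   # added in the modified framework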

Adoption of the Conceptual Framework for Implementation Fidelity will help identify the factors affecting the implementation of the Better Start Bradford interventions (independently and collectively) and, in turn, examine their impact on outcomes as interventions increase in their potential for evaluation of impact. While drawing on published examples,12 14 we plan to apply the framework consistently across multiple interventions, with much of the data collection being integrated into the routine delivery of interventions to yield an efficient and pragmatic approach to evaluation. Table 2 outlines the evaluation framework, including the overarching research questions, corresponding data sources and methods of collection.

Table 2

Implementation and process evaluation key elements and research questions within Better Start Bradford

Data collection

Data for the implementation evaluation will be derived from a number of sources:

Quantitative data collected by intervention teams

Prior to the implementation of each intervention, a service design process takes place in collaboration with commissioners, intervention delivery teams, academic researchers and other stakeholders, including health professionals and community representatives, to ensure each intervention meets the needs of the local population. During this process, recruitment targets are agreed, along with the process and outcome data to be collected by intervention teams throughout the delivery period and submitted quarterly to the research team. A guide to the service evaluation process and templates, including a minimum dataset for implementation evaluation, is available on our website.15

Satisfaction questionnaires

We have developed a brief six-item satisfaction questionnaire to capture participants’ satisfaction across all interventions (see online supplementary additional file 1). Questions are based on the key constructs of commonly used patient satisfaction surveys,16 17 but have been adapted following advice from our Community Research Advisory Group (CRAG), composed of local parents and volunteers, alongside intervention team managers, commissioners and the research team, to ensure acceptability for the local community. This process resulted in a questionnaire that is brief, uses visual cues and simple language, and can be easily understood and translated into other languages.


Semistructured interviews and focus groups

Semistructured interviews and/or focus groups, where appropriate, will be undertaken with intervention participants and delivery teams to allow more in-depth exploration of elements of the conceptual model. Topic guides will be based on the Theoretical Domains Framework (TDF).18 19 The TDF encompasses a comprehensive range of constructs from theories of behaviour change, including beliefs about capabilities, knowledge, skills, emotions and social influences. Furthermore, use of the TDF provides a firm theoretical basis for understanding the mechanisms of action as well as the barriers and facilitators of implementation.20 It has been extensively applied to investigate and address implementation problems.20 While the interview questions may differ by intervention, use of the TDF ensures that the underlying theoretical concepts are explored using a consistent approach across all interviews.

All studies will include data from sources 1 and 2. In-depth qualitative work may be triggered in response to issues identified by intervention teams, such as difficulties in engaging families from particular ethnic groups or low completion rates, and by priorities highlighted by the commissioning team.

Eligibility

Inclusion criteria for all participants are listed in box 1.

Box 1

Inclusion criteria for all participants

(a) Intervention participants

  • Reside in a postcode within the Better Start Bradford area

  • Are enrolled to attend a Better Start Bradford intervention OR

  • Are eligible but declined to take part, or dropped out (where relevant)

  • Agree for their data to be shared with the research team for evaluation purposes

  • Agree to be contacted by the research team, where further qualitative studies are planned

(b) Intervention staff, volunteers, stakeholders and/or commissioners

  • Work/volunteer for an intervention or are actively involved in commissioning or delivering an intervention

  • Have delivered at least one full intervention according to the intervention delivery schedule (intervention delivery teams only)

  • Agree to take part in an interview/focus group/observation

Exclusion criteria

For qualitative studies, participants who have completed an interview/focus group within the past 12 months will not be approached to take part in a second study to avoid unnecessary burden.

Sample size and selection

Quantitative process and demographic data will be collected for all participants who consent for their data to be shared with the evaluation team. Sample sizes will not be determined in advance, as they will depend on the uptake of interventions.

For the qualitative evaluation, a purposive sampling method will be used to identify and recruit participants across key characteristics, including ethnicity, number of children and primary language, to reflect the different ethnic and cultural groups in Bradford. Other characteristics will be included based on the key objectives of the intervention as defined during the service design process, for example, maternal mental well-being for interventions relating to social and emotional health, and body mass index for interventions relating to nutrition. We will continue to recruit until we reach data saturation, with an estimate of 20–30 interview participants per intervention. Where focus groups are undertaken, we aim to recruit 8–10 participants per group, with potentially separate groups depending on participants’ ethnicity, gender, primary language and/or neighbourhood area. For staff/volunteers, we aim to interview a minimum of five to six people per intervention, depending on the size of the intervention delivery team.

Recruitment

Quantitative intervention data and satisfaction questionnaires

Quantitative data will be collected from all participants as part of standard service provision.

Qualitative studies

Participants will be identified and approached by one of two methods:

1. By researchers directly from the BiBBS cohort

As part of the consent process for our experimental BiBBS cohort study, expectant parents consent to being contacted in the future to learn about participation in further studies. Researchers will write to consenting parents, enclosing a cover letter and an information sheet about the qualitative studies, and will contact them by telephone within 2 weeks. Where a participant speaks a language other than English, the initial phone call will be made by a bilingual researcher or an interpreter.

2. By intervention delivery staff/managers

Intervention coordinators will be asked to circulate the study information sheet to all participants who have enrolled onto an intervention. Coordinators will share the names and contact details of those willing to be interviewed with the research team. Staff and volunteers will first be approached by their service managers and, if they agree, their contact details will be passed on to the research team.

With both methods of approach, we will check the eligibility (as described above) of those interested in taking part and confirm that they have read and understood the information sheet. If they agree to take part, a convenient date, time and place for the interview will be arranged.

Consent

Quantitative data sharing

All intervention participants are given a privacy notice (compliant with the General Data Protection Regulation (GDPR)) when they enrol for an intervention, which explains that data will be shared for service evaluation. The privacy notice and consent form clearly explain how participants can opt out of data sharing/withdraw their consent at any time. All forms were developed with the English language and literacy abilities of the service participants in mind and with guidance from both our CRAG and members of the wider community; the simplified version of the privacy notice has a Flesch reading ease score of 61 and is deemed to be easily understood by individuals aged 12 and above.21 Information is also available in print in Urdu, Bengali and Slovakian, and is translatable to any other language through the Better Start Bradford website.22
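For reference, the Flesch reading ease score cited above is a standard formula based on average sentence length and syllables per word, with scores of 60–70 generally read as plain English. The minimal sketch below shows the formula; the syllable counter is a crude approximation included for illustration only.

    # Flesch reading ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # The syllable counter is a rough approximation for illustration only.
    import re

    def count_syllables(word: str) -> int:
        # Approximate syllables as groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

    print(round(flesch_reading_ease("We keep your data safe. You can opt out at any time."), 1))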

Qualitative data sharing

We will obtain written informed consent from all participants prior to commencing interviews/focus groups. For interviews with non–English-speaking participants, the information sheet and consent form will be explained by an interpreter. At every stage of the research process, the right of participants to refuse consent without giving a reason will be respected.

Data management

Quantitative intervention and satisfaction questionnaire data

Data-sharing agreements will be in place between intervention teams, Better Start Bradford and the research team before data are shared. For those participants who have agreed to data sharing, the intervention teams will share individual-level, identifiable data with the research team using secure transfer methods. Personally identifiable data items will be removed and replaced with a unique intervention identification number prior to analysis.

Qualitative data

Audio recordings will be uploaded to an encrypted and secure network and will be deleted following transcription and verification of transcripts.

Confidentiality

All data shared will be strictly confidential and held securely for the duration of the Better Start Bradford programme. We will comply with all aspects of the GDPR,23 abide by the Caldicott principles and work within NHS Information Governance requirements. Anonymised data and transcripts will be available to the research team for the purposes of service evaluation only.

Analysis

Quantitative intervention and satisfaction questionnaire data

Data will be summarised using descriptive statistics, including frequencies, summary statistics, CI estimates and ranges for continuous variables (eg, participant age, referral and recruitment rate, attrition), and proportions/percentages for categorical variables (eg, ethnicity, intervention completion). The analysis will also explore whether there are any differences in referrals, recruitment rates, intervention reach, attendance and satisfaction between different groups of participants, for example, by parity, ethnicity and spoken English proficiency.
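As a minimal sketch of these summaries (assuming the routinely collected data arrive as a flat table; the file and column names below are hypothetical), the analysis might look as follows in Python.

    # Minimal sketch of the planned descriptive summaries; file and column
    # names are hypothetical.
    import pandas as pd
    import scipy.stats as st

    df = pd.read_csv("intervention_process_data.csv")  # hypothetical extract

    # Continuous variable (eg, participant age): mean, 95% CI and range
    age = df["participant_age"].dropna()
    ci = st.t.interval(0.95, len(age) - 1, loc=age.mean(), scale=st.sem(age))
    print(f"Age: mean {age.mean():.1f}, 95% CI {ci[0]:.1f} to {ci[1]:.1f}, "
          f"range {age.min()}-{age.max()}")

    # Categorical variable (eg, completion) as percentages by ethnic group
    completion = (df.groupby("ethnic_group")["completed_intervention"]
                    .value_counts(normalize=True)
                    .mul(100)
                    .round(1))
    print(completion)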

Qualitative data

Qualitative data will be analysed using thematic analysis (TA), a widely used method in evaluative studies that identifies and reports patterns within the data.24 TA was chosen because it allows an understanding of the data to be developed and patterns in participants’ thoughts and views to be examined. Specific barriers and enablers influencing the implementation of, and satisfaction with, interventions will be coded according to the TDF.18 19 We will also explore any patterning of themes by individuals’ ethnicity, socioeconomic circumstances and English-language ability. Transcripts will be coded systematically and iteratively until the analysis team are satisfied that the emerging framework adequately captures the data and saturation has been achieved. Ten per cent of the transcripts will be coded by a second researcher to ensure the reliability of the coding framework. Any disagreements will be resolved through discussion and by revisiting the coding framework. Data will be managed within the NVivo data management program (NVivo qualitative data analysis software; QSR International Pty Ltd).
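Although the protocol resolves coding disagreements through discussion rather than a formal statistic, agreement on the 10% of double-coded transcripts could also be quantified; the sketch below illustrates one way of doing so with Cohen’s kappa over hypothetical TDF domain codes.

    # Illustrative only: quantifying double-coding agreement with Cohen's kappa.
    # The protocol itself resolves disagreements through discussion; the TDF
    # domain labels below are hypothetical examples.
    from sklearn.metrics import cohen_kappa_score

    coder_1 = ["knowledge", "skills", "social_influences", "emotion", "knowledge"]
    coder_2 = ["knowledge", "skills", "beliefs_about_capabilities", "emotion", "knowledge"]

    kappa = cohen_kappa_score(coder_1, coder_2)
    print(f"Cohen's kappa: {kappa:.2f}")  # values above ~0.6 are often read as substantial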

Patient and public involvement

Community involvement is integral to the ethos of BiBBS and Better Start Bradford. As such, we have set up a CRAG composed of local community representatives, including parents, volunteers, councillors and leaders of local groups and charities. The group has been involved at every stage of development, including setting the overall evaluation objectives and developing information sheets, consent forms and satisfaction questionnaires. The CRAG will continue to advise on the development and refinement of topic guides and on methods for engaging local parents, as well as playing a key role in the interpretation and dissemination of findings.

Dissemination

Findings will be disseminated widely to aid commissioning decisions and to ensure shared learning with local partners. They will also be shared at local and national conferences, relevant public health events and via publication in academic journals. Finally, summaries of key findings will be shared with participants and the local community via our CRAG, newsletters and the Born in Bradford website.25

Discussion

In this paper, we have outlined our framework for implementation evaluation across multiple, complex early years interventions. This framework has so far proved invaluable in ensuring consistent and manageable data collection across all interventions and in identifying and resolving issues in the quality of routinely collected data. Through the cyclical transfer of knowledge, findings from our implementation evaluations may also help delivery teams respond to any challenges identified and further optimise the delivery and reach of their interventions.

Understanding the key components of interventions and the local context is integral to ensuring the successful implementation of public health interventions. However, setting up evaluations for interventions being delivered as part of usual practice is challenging, particularly where adaptations are required to ensure interventions can be integrated into the complex systems in which they are being delivered. Through partnership working with a wide range of stakeholders, including service providers, commissioners and community representatives, our evaluation approach will also consider the role of contextual factors, delivery procedures, acceptability and scalability of the interventions. We have shared our learning on the practicalities of translating research into practice, the challenges encountered and the strategies adopted to address them elsewhere.26

Our implementation evaluation framework and associated tools15 are designed to be sustainable beyond our involvement as external service evaluators to allow commissioners and intervention teams to continue monitoring and evaluating the implementation of their services. While in-depth qualitative evaluation may still require input from researchers, the rest of this framework can be applied by service providers and commissioners to embed pragmatic evaluation within the delivery of services while taking positive steps towards building a robust evidence base for early years interventions.

Acknowledgments

We are grateful to all the participants, all members of the Community Research Advisory Group, Born in Bradford staff, the Better Start Bradford staff and projects, health professionals and researchers who have supported the development and set-up of this evaluation study.

References

  1.
  2.
  3.
  4.
  5.
  6.
  7.
  8.
  9.
  10.
  11.
  12.
  13.
  14.
  15.
  16.
  17.
  18.
  19.
  20.
  21.
  22.
  23.
  24.
  25.
  26.

Footnotes

  • Contributors ND, JD, KW, AD, SA, DN, RRCM and EU contributed to the design of the study, were involved in drafting this manuscript, approved the final version of this manuscript and agree to be accountable for this work.

  • Funding This study has received funding through a peer-review process from the Big Lottery Fund as part of the A Better Start programme.

  • Disclaimer The Big Lottery Fund have not had any involvement in the design or writing of the study protocol.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval The protocol for the BiBBS cohort including consent to contact for other studies has been approved by Bradford Leeds NHS Research Ethics Committee (15/YH/0455). Research governance approval has been provided from Bradford Teaching Hospitals NHS Foundation Trust. The Health Research Authority has confirmed that our implementation evaluations do not require review by an NHS Research Ethics Committee (HRA decision 60/88/81). However, we will adhere to all ethical principles in the conduct of our evaluations and written informed consent will be obtained from all participants prior to qualitative interviews and/or focus groups.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement There are no data in this work.