Replicating and Scaling Up Evidence-Based Home Visiting Programs: The Role of Implementation Research


Diane Paulsell, Mathematica Policy Research, USA
Rev. ed.


Introduction

Over the past two decades, a growing number of home visiting programs have been developed and implemented in North America and internationally to support parents with young children. In the US, home visiting programs for families with pregnant women and young children operate in all 50 states, the District of Columbia, 5 territories, and 22 tribal communities, with an estimated 335,000 families receiving more than 3.7 million home visits.1 The majority of these programs implement home visiting models that are evidence-based, meaning that their effectiveness has been demonstrated through rigorous evaluation; some programs also implement emerging models that do not yet have rigorous evidence to support their use.1

Over the past decade, the US government has substantially increased funding for evidence-based home visiting models. In 2010, the US Congress included the Maternal, Infant, and Early Childhood Home Visiting Program (MIECHV) in the Patient Protection and Affordable Care Act (ACA) as a national strategy for improving the health and well-being of families with pregnant women and children ages birth to 5. The ACA provided grants to states and stipulated that at least 75 percent of the funds must be spent on home visiting models with evidence of effectiveness based on rigorous evaluation. In 2019, the US Congress reauthorized MIECHV at $400 million a year for an additional 5 years. In the field of home visiting, an increasing number of programs have been rigorously evaluated and have demonstrated evidence of effectiveness in outcome domains such as parenting, maternal and child health, child development and school readiness, reductions in child maltreatment, and family economic self-sufficiency.2,3,4,5 As of 2020, the US Department of Health and Human Services identified 21 home visiting programs with rigorous evidence of effectiveness.6

Subject

Identifying the core components of interventions found to be effective, and understanding what it takes to implement those components with fidelity to the program model, are critical to successful replication and scale-up of effective programs and practices in different community contexts and populations.7 There is growing recognition in the early childhood field of the importance of effective implementation and of the need for implementation research that can guide the adoption, initial implementation, and ongoing improvement of early childhood interventions.8,9,10 The promise of implementation research, and of using data to drive program management, is compelling because it offers a potential way to narrow persistent gaps in outcomes between at-risk children and their more well-off peers. This article discusses implementation research in the home visiting field, how such research can be used to strengthen programs and improve targeted outcomes, and the conditions and supports necessary for effective implementation.

Problems

Simply adopting an evidence-based home visiting program and meeting the initial start-up requirements of the model developer is not enough to ensure that it will produce the positive effects for children and families found in evaluation research.11 Home visiting services should be implemented with fidelity to the program model. For example, home visitors should have required qualifications, visits should occur at the intended frequency and duration, visit content should be delivered as intended, and the quality of services provided to families should be high. Moreover, service providers need adequate supports and resources to sustain implementation with a high degree of fidelity over time.12,13

Research Context

While the body of rigorous research on the effectiveness of home visiting programs has grown substantially in recent years, research on implementation lags behind.10,14 Research reports and articles typically provide only minimal information about how programs are implemented and their fidelity to the program model.10 As national and local governments, communities and service providers seek to scale up the use of evidence-based home visiting programs, research is needed to develop program fidelity standards and measures, understand the conditions necessary for high-fidelity implementation, and create tools to assess implementation and support program improvement.

Key Research Questions

This review is designed to address two questions:

  1. What do we know about fidelity of implementation in evidence-based home visiting programs?
  2. What conditions and resources are necessary to support and sustain high-fidelity implementation over time?

Recent Research Results

What do we know about fidelity of implementation in evidence-based home visiting programs?

Researchers have developed a number of theoretical frameworks that define implementation fidelity.15,16,17 Most include adherence to the program model, dosage, quality, and participants’ responsiveness and engagement in services; some include the quality of participant-provider relationships.

While research on fidelity in home visiting programs is fairly sparse, studies have documented some components, such as dosage and duration of services, home visit content, and participant-provider relationships. Research shows that families typically receive roughly half of the number of home visits expected.12,18,19 Research also shows that many, perhaps most, families enrolled in home visiting programs drop out before their eligibility ends.12,20,21 Some home visiting studies have varied the dosage that families were offered and found that fewer home visits produced outcomes similar to higher levels of exposure.22

Systematic study of activities and topics discussed during home visits is essential for understanding whether content was delivered as intended and how content varies across families and over time. While most programs provide curriculum guidelines and training for home visitors, research suggests that content is not always delivered as planned and varies across families. For example, multiple studies have found that, despite program objectives that emphasize parenting, little time or emphasis was placed on parent-child interactions.23,24 A study of Early Head Start found that, on average, home visitors spent 14 percent of each home visit on activities designed to improve parent-child interactions.25 Fidelity frameworks also emphasize the importance of developing positive participant-home visitor relationships, since these relationships may influence the extent of parent engagement and involvement in home visits.12,20,26,27 Some research indicates that higher-quality relationships are associated with better outcomes for children.28,29

What conditions and resources are necessary to support and sustain high-fidelity implementation over time?

Best practice and emerging research suggest that home visiting staff need training, supervision and fidelity monitoring, a supportive organizational climate, and mental health supports to sustain high-fidelity implementation over time.20 The effects of these kinds of supports have not been well studied, but some research on similar interventions indicates that implementing evidence-based practices with fidelity monitoring and supportive consultation predicts lower rates of staff turnover and lower levels of staff emotional exhaustion relative to services as usual.30,31,32 Moreover, a supportive organizational climate has been associated with more positive attitudes toward adoption of evidence-based programs.32

Research Gaps

More research is needed to guide decisions about the adoption, adaptation, and replication of evidence-based home visiting programs and to support their scale-up. For example, research is needed to determine the thresholds of dosage and duration of services necessary to positively affect family and child outcomes. Planned variation studies, in which program components, content, home visitor training, or dosage of services are varied, can identify core dimensions of implementation that are critical for achieving program impacts, as well as dimensions that could be adapted for different contexts and populations without threatening the program’s effectiveness.

To facilitate these studies, more work is needed to develop implementation measures. While some measures have been developed – such as observational measures of home visiting quality and scales for assessing the participant-home visitor relationship – their validity and reliability have not been sufficiently tested with different populations and service delivery contexts.20,33,34,35

Conclusions

As interest grows in the promise of evidence-based home visiting programs to improve outcomes for children and families, policymakers and practitioners need guidance about how to implement them effectively and sustain high-fidelity implementation over the long term. While the body of implementation research on home visiting programs is growing, more work is needed. Research shows that most programs do not deliver the full dosage of services intended, and families often drop out of programs before their eligibility ends. Variation also exists in adherence to intended activities and topics covered during home visits. Emerging research points to the importance of supportive supervision, fidelity monitoring, and a positive organizational climate in supporting home visitors and sustaining the evidence-based program. Additional research on these topics can provide guidance and tools for promoting successful implementation of evidence-based home visiting and adaptation of program models to different populations and contexts.

Implications for Parents, Services and Policy

Supporting high-fidelity implementation of evidence-based home visiting programs has the potential to improve outcomes for at-risk children and families. Policymakers and funders should use the available research on implementation and encourage future work to guide decisions about how to scale up evidence-based programs effectively and support them over time. For example, implementation research can be used to assess the readiness of local agencies to implement home visiting programs with fidelity. Government and other funders can use implementation research to structure requirements for monitoring and reporting on specific dimensions of implementation. Government and funders at all levels can support these efforts by creating data systems to facilitate fidelity monitoring and use of data for program improvement. Moreover, implementation research can inform staff training and ongoing technical assistance. For parents, the implication is that participation and engagement matter. Parents must understand the goals of the program they are enrolling in and the expectations for taking up and participating in services. To achieve intended dosage, program staff may need to help parents address barriers to their participation.

Researchers should continue building the knowledge base about how to implement home visiting programs effectively by reporting information on implementation alongside results of rigorous effectiveness evaluations. Additional research on the replication and scale-up of home visiting programs should be conducted to identify the conditions, processes, and supports associated with achieving and sustaining high-fidelity implementation.

References

  1. National Home Visiting Resource Center. 2020 Home Visiting Yearbook. Arlington, VA: James Bell Associates and the Urban Institute; 2020.

  2. Avellar SA, Supplee LH. Effectiveness of home visiting in improving child health and reducing child maltreatment. Pediatrics 2013;132(Suppl 2):S90-S99.

  3. Filene J, Kaminski J, Valle L, Cachat P. Components associated with home visiting program outcomes: A meta-analysis. Pediatrics 2013;132(Suppl 2):S100-S109.

  4. Peacock S, Konrad S, Watson E, Nickel D, Muhajarine N. Effectiveness of home visiting programs on child outcomes: A systematic review. BMC Public Health 2013;13:17.

  5. Supplee L, Paulsell D, Avellar S. What works in home visiting programs? In: Nelson K, Scheitzer D, eds. What Works in Child Welfare. Washington, DC: Child Welfare League of America Press; 2012:39-61.

  6. HomVEE Team. Early childhood home visiting: reviewing evidence of effectiveness. OPRE Report #2020-126. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. 2020.

  7. Fixsen DL, Blase KA, Naoom SF, Wallace F. Core implementation components. Research on Social Work Practice 2009;19(5):531-540.

  8. Avellar S, Paulsell D. Lessons learned from the home visiting evidence of effectiveness review. Princeton, NJ: Mathematica Policy Research; 2011.

  9. Kaderavek JN, Justice LM. Fidelity: an essential component of evidence-based practice in speech-language pathology. American Journal of Speech-Language Pathology 2010;19(4):369-379.

  10. Paulsell D, Del Grosso P, Supplee L. Supporting replication and scale-up of evidence-based home visiting programs: Assessing the implementation knowledge base. American Journal of Public Health 2014;104(9): 1624-1632.

  11. Durlak JA, DuPre EP. Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology 2008;41(3-4):327-350.

  12. Boller K, Daro D, Del Grosso P, Cole R, Paulsell D, Hart B, Coffee-Bordon B, Strong D, Zaveri H, Hargreaves M. Making replication work: Building infrastructure to implement, scale up, and sustain evidence-based early childhood home visiting programs with fidelity. Washington, DC: Children’s Bureau, Administration for Children and Families, U.S. Department of Health and Human Services; 2014.

  13. Hargreaves M, Cole R, Coffee-Borden B, Paulsell D, Boller K. Evaluating infrastructure development in complex home visiting systems. American Journal of Evaluation 2013;34(2):147-169.

  14. Supplee LH, Metz A. Opportunities and challenges in evidence-based social policy. SRCD Social Policy Report 2015;28(4):1-16.

  15. Daro D. Replicating evidence-based home visiting models: A framework for assessing fidelity. Princeton, NJ: Mathematica Policy Research; 2010.

  16. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implementation Science 2007;2:40.

  17. Berkel C, Mauricio AM, Schoenfelder E, Sandler IN. Putting the pieces together: An integrated model of program implementation. Prevention Science 2010;12(1):23-33.

  18. Kitzman HJ. Effective Early Childhood Development Programs for Low-Income Families: Home Visiting Interventions During Pregnancy and Early Childhood. In: Tremblay RE, Boivin M, Peters RDeV, eds. Spiker D, Gaylor E, topic eds. Encyclopedia on Early Childhood Development [online]. https://www.child-encyclopedia.com/home-visiting/according-experts/effective-early-childhood-development-programs-low-income-families. Published: February 2004. Accessed January 18, 2022.

  19. Riley S, Brady AE, Goldberg J, Jacobs F, Easterbrooks MA. Once the door closes: Understanding the parent-provider relationship. Children and Youth Services Review 2008;30(5):597-612.

  20. Duggan A, Portilla XA, Filene JH, Crown SS, Hill CJ, Lee H, Knox V. Implementation of Evidence-Based Early Childhood Home Visiting: Results from the Mother and Infant Home Visiting Program Evaluation. OPRE Report #2018-76A. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. 2018.

  21. Love JM, Kisker EE, Ross CM, Schochet PZ, Brooks-Gunn J, Paulsell D, Brady-Smith C. Making a difference in the lives of infants and toddlers and their families: The impacts of Early Head Start. Princeton, NJ: Mathematica Policy Research; 2002.

  22. DePanfilis D, Dubowitz H. Family connections: A program for preventing child neglect. Child Maltreatment 2005;10(2):108-123.

  23. Peterson CA, Luze GJ, Eshbaugh EM, Jeon HJ, Kantz KR. Enhancing parent-child interactions through home visiting: Promising practice or unfulfilled promise? Journal of Early Intervention 2007;29(2):119-140.

  24. Hebbeler KM, Gerlach-Downie SG. Inside the black box of home visiting: A qualitative analysis of why intended outcomes were not achieved. Early Childhood Research Quarterly 2002;17(1):28-51.

  25. Vogel CA, Boller K, Xue Y, Blair R, Aikens N, Burwick A, Stein J. Learning as we go: A first snapshot of Early Head Start programs, staff, families, and children. OPRE Report #2011-7. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. 2011.

  26. Korfmacher J, Green B, Spellmann M, Thornburg KR. The helping relationship and program participation in early childhood home visiting. Infant Mental Health Journal 2007;28(5):459-480.

  27. Korfmacher J, Green B, Staerkel F, Peterson C, Cook G, Roggman L, Faldowski RA, Schiffman R. Parent involvement in early childhood home visiting. Child Youth Care Forum 2008;37(4):171-196.

  28. Peterson CA, Roggman LA, Staerkel F, Cook G, Jeon HJ, Thornburg K. Understanding the dimensions of family involvement in home-based Early Head Start. Unpublished manuscript. Iowa State University, Ames, Iowa. 2006.

  29. Roggman LA, Christiansen K, Cook GA, Jump VK, Boyce LK, Peterson CA. Home visits: Measuring how they work. Logan, UT: Early Intervention Research Institute Mini-Conference. 2006.

  30. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research 2007;34(4):411-419.

  31. Aarons GA, Sommerfeld DH, Hecht DB, Silovsky JF, Chaffin MJ. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology 2009;77(2):270-280.

  32. Aarons GA, Fettes DL, Flores LE Jr, Sommerfeld DH. Evidence-based practice implementation and staff emotional exhaustion in children’s services. Behaviour Research and Therapy 2009;47(11):954-960.

  33. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services 2006;3(1):61-72.

  34. Paulsell D, Boller K, Hallgren K, Esposito AM. Assessing home visit quality: Dosage, content, and relationships. Zero To Three 2010;30(6):16-21.

  35. Aikens N, Xue Y, Bandel E, Vogel CA, Boller K. Measuring Up: Assessing the Quality of Early Head Start Home Visiting and Classrooms. OPRE Brief #2015-35. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. 2015.

How to cite this article:

Paulsell D. Replicating and Scaling Up Evidence-Based Home Visiting Programs: The Role of Implementation Research. In: Tremblay RE, Boivin M, Peters RDeV, eds. Spiker D, Gaylor E, topic eds. Encyclopedia on Early Childhood Development [online]. https://www.child-encyclopedia.com/home-visiting/according-experts/replicating-and-scaling-evidence-based-home-visiting-programs-role. Updated: January 2022. Accessed April 20, 2024.
