
Replicating and Scaling Up Evidence-Based Home Visiting Programs: The Role of Implementation Research

Diane Paulsell, MPA

Mathematica Policy Research, USA

September 2012

Introduction

Over the past two decades, a growing number of home visiting programs have been developed and implemented in North America and internationally to support parents with young children. Home visiting programs serving pregnant women and families with young children operate in all 50 states in the United States, with an estimated 400,000 to 500,000 families receiving services.1 These programs span a continuum from locally developed programs, to evidence-informed programs (developed based on best-practice evidence but not evaluated), to evidence-based programs (those with rigorous evaluation evidence of effectiveness).

During the same time period, interest has grown among policy makers, practitioners, and funders in North America, the United Kingdom and elsewhere in promoting the use of practices and interventions with scientific evidence of effectiveness. In the US, the Obama administration has funded a range of initiatives that require the use of evidence-based strategies in areas such as teen pregnancy prevention, home visiting, education and workforce innovation.2,3 In the field of home visiting, an increasing number of programs have been rigorously evaluated and have demonstrated evidence of effectiveness in outcome domains such as parenting, maternal and child health, child development and school readiness, reductions in child maltreatment, and family economic self-sufficiency.4,5,6

Subject

Identifying the core components of interventions found to be effective, and understanding what it takes to implement those components with fidelity to the program model, is critical to successfully replicating and scaling up effective programs and practices across community contexts and populations.7 There is growing recognition in the early childhood field of the importance of effective implementation and the need for implementation research that can guide adoption, initial implementation, and ongoing improvement of early childhood interventions.8,9,10 Implementation research, and the use of data to drive program management, holds particular promise because it offers a potential solution to persistent gaps in outcomes between at-risk children and their better-off peers. This article discusses implementation research in the home visiting field, how such research can be used to strengthen programs and improve targeted outcomes, and the conditions and supports necessary for effective implementation.

Problems

Simply adopting an evidence-based home visiting program and meeting the initial start-up requirements of the model developer is not enough to ensure that it will produce the positive effects for children and families found in evaluation research.11 Home visiting services should be implemented with fidelity to the program model. For example, home visitors should have required qualifications, visits should occur at the intended frequency and duration, visit content should be delivered as intended, and the quality of services provided to families should be high. Moreover, service providers need adequate supports and resources to sustain implementation with a high degree of fidelity over time.12

Research Context

While the body of rigorous research on the effectiveness of home visiting programs has grown substantially in recent years, research on implementation lags behind.4 Research reports and articles typically provide only minimal information about how programs are implemented and their fidelity to the program model.8 As national and local governments, communities and service providers seek to scale up the use of evidence-based home visiting programs, research is needed to develop program fidelity standards and measures, understand the conditions necessary for high-fidelity implementation, and create tools to assess implementation and support program improvement.

Key Research Questions

This review is designed to address two questions:

  1. What do we know about fidelity of implementation in evidence-based home visiting programs?
  2. What conditions and resources are necessary to support and sustain high-fidelity implementation over time?

Recent Research Results

What do we know about fidelity of implementation in evidence-based home visiting programs?

Researchers have developed a number of theoretical frameworks that define implementation fidelity.13,14,15 Most include adherence to the program model, dosage, quality, and participants’ responsiveness and engagement in services; some include the quality of participant-provider relationships.

While research on fidelity in home visiting programs is fairly sparse, studies have documented some components, such as dosage and duration of services, home visit content, and participant-provider relationships. Research shows that families typically receive roughly half of the number of home visits expected.16,17 For example, across three randomized controlled trials of the Nurse-Family Partnership, average dosage ranged from 45 to 62 percent of the expected number of visits.18 Research also shows that many, perhaps most, families enrolled in home visiting programs drop out before their eligibility ends.16,19,20 Some home visiting studies have varied the dosage offered to families and found that fewer home visits produced outcomes similar to higher levels of exposure.21

Systematic study of activities and topics discussed during home visits is essential for understanding whether content was delivered as intended and how content varies across families and over time. While most programs provide curriculum guidelines and training for home visitors, research suggests that content is not always delivered as planned and varies across families. For example, multiple studies have found that, despite program objectives that emphasize parenting, little time or emphasis was placed on parent-child interactions.22,23 A recent study of Early Head Start found that, on average, home visitors spent 14 percent of each home visit on activities designed to improve parent-child interactions.24 Fidelity frameworks also emphasize the importance of developing positive participant-home visitor relationships, since these relationships may influence the extent of parent engagement and involvement in home visits.17,25,26 Some research indicates that higher-quality relationships are associated with better outcomes for children.27,28

What conditions and resources are necessary to support and sustain high-fidelity implementation over time?

Best practice and emerging research suggest that home visiting staff need training, supervision and fidelity monitoring, a supportive organizational climate, and mental health supports to sustain high-fidelity implementation over time. The effects of these supports on home visitors have not been well studied, but some research on similar interventions indicates that implementing evidence-based practices with fidelity monitoring and supportive consultation predicts lower rates of staff turnover, as well as lower levels of staff emotional exhaustion, relative to services as usual.29,30,31 Moreover, a supportive organizational climate has been associated with more positive attitudes toward adoption of evidence-based programs.32

Research Gaps

More research is needed to guide decisions about adoption, adaptation and replication, and support scale-up of evidence-based home visiting programs. For example, research is needed to determine the thresholds of dosage and duration of services necessary to positively affect family and child outcomes. Planned variation studies, in which program components, content, home visitor training, or dosage of services is varied, can identify core dimensions of implementation that are critical for achieving program impacts, as well as dimensions that could be adapted for different contexts and populations without threatening the program’s effectiveness.

To facilitate these studies, more work is needed to develop implementation measures. While some measures have been developed – such as observational measures of home visiting quality and scales for assessing the participant-home visitor relationship – their validity and reliability have not been sufficiently tested with different populations and service delivery contexts.33

Conclusions

As interest in the promise of evidence-based home visiting programs to improve outcomes for children and families grows, policymakers and practitioners need guidance about how to implement them effectively and sustain high-fidelity implementation over the long term. While the body of implementation research on home visiting programs is growing, more work is needed. Research shows that most programs do not deliver the full dosage of services intended, and families often drop out of programs before their eligibility ends. Variation also exists in adherence to intended activities and topics covered during home visits. Emerging research points to the importance of supportive supervision, fidelity monitoring, and organizational climate to support home visitors and maintain support for the evidence-based program. Additional research on these topics can provide guidance and tools for promoting successful implementation of evidence-based home visiting and adaptation of program models to different populations and contexts.

Implications for Parents, Services and Policy

Supporting high-fidelity implementation of evidence-based home visiting programs has the potential to improve outcomes for at-risk children and families. Policymakers and funders should use the available research on implementation and encourage future work to guide decisions about how to scale up evidence-based programs effectively and support them over time. For example, implementation research can be used to assess the readiness of local agencies to implement home visiting programs with fidelity. Government and other funders can use implementation research to structure requirements for monitoring and reporting on specific dimensions of implementation. Government and funders at all levels can support these efforts by creating data systems to facilitate fidelity monitoring and use of data for program improvement. Moreover, implementation research can inform staff training and ongoing technical assistance. For parents, the implication is that participation and engagement matter. Parents must understand the goals of the program they are enrolling in and the expectations for taking up and participating in services. To achieve intended dosage, program staff may need to help parents address barriers to their participation.

Researchers should continue building the knowledge base about how to implement home visiting programs effectively by reporting information on implementation alongside results of rigorous effectiveness evaluations. Additional research on the replication and scale-up of home visiting programs should be conducted to identify the conditions, processes, and supports associated with achieving and sustaining high-fidelity implementation.

References

  1. Stoltzfus, E., & Lynch, H. (2009). Home visitation for families with young children. Washington, DC: Congressional Research Service.
  2. Goesling, B. (2011). Building, evaluating, and using an evidence base to inform the DHHS Teen Pregnancy Prevention Program. Paper presented at the Society for Adolescent Health and Medicine Annual Meeting, Seattle, WA.
  3. Haskins, R., & Baron, J. (2011). Building the connection between policy and evidence. London, UK: NESTA.
  4. Paulsell, D., Avellar, S., Sama Miller, E., & Del Grosso, P. (2011). Home visiting evidence of effectiveness: Executive summary. Princeton, NJ: Mathematica Policy Research.
  5. Daro, D. (2006). Home visitation: Assessing progress, managing expectations. Chicago, IL: Chapin Hall at the University of Chicago.
  6. Gomby, D. S. (2005). Home visitation in 2005: Outcomes for children and parents. (Invest in Kids Working Paper No. 7). Washington, DC: Committee on Economic Development.
  7. Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19, 531-540.
  8. Avellar, S., & Paulsell, D. (2011). Lessons learned from the home visiting evidence of effectiveness review. Princeton, NJ: Mathematica Policy Research.
  9. Kaderavek, J. N., & Justice, L. M. (2010). Fidelity: an essential component of evidence-based practice in speech-language pathology. American Journal of Speech-Language Pathology, 19, 369-379.
  10. Paulsell, D., Porter, T., Kirby, G., Boller, K., Sama Martin, E., Burwick, A., Ross, C., & Begnoche, C. (2010). Supporting quality in home-based child care: Initiative design and evaluation options. Princeton, NJ: Mathematica Policy Research.
  11. Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327-350.
  12. Coffee-Borden, B., & Paulsell, D. (2010). Recruiting and training home visitors for evidence-based home visiting: Experiences of EBHV grantees. Princeton, NJ: Mathematica Policy Research.
  13. Daro, D. (2010). Replicating evidence-based home visiting models: A framework for assessing fidelity. Princeton, NJ: Mathematica Policy Research.
  14. Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science, 2, 40.
  15. Berkel, C., Mauricio, A. M., Schoenfelder, E., & Sandler, I. N. (2010). Putting the pieces together: An integrated model of program implementation. Prevention Science, 12, 23-33.
  16. Kitzman, H. J. (2004). Effective early childhood development programs for low-income families: Home visiting interventions during pregnancy and early childhood. In R. E. Tremblay, R. G. Barr, & R. DeV. Peters (Eds.), Encyclopedia on Early Childhood Development [online] (pp. 1-7). Montreal, Quebec: Centre of Excellence for Early Childhood Development. Available at: http://www.childencyclopedia.com/documents/KitzmanANGxp-Home.pdf. Accessed July 30, 2012.
  17. Riley, S., Brady, A. E., Goldberg, J., Jacobs, F., & Easterbrooks, M. A. (2008). Once the door closes: Understanding the parent-provider relationship. Children and Youth Services Review, 30, 597-612.
  18. Personal communication from Dr. David Olds to Dr. Kimberly Boller, January 25, 2012.
  19. Love, J. M., Kisker, E. E., Ross, C. M., Schochet, P. Z., Brooks-Gunn, J., Paulsell, D., & Brady-Smith, C. (2002). Making a difference in the lives of infants and toddlers and their families: The impacts of Early Head Start. Princeton, NJ: Mathematica Policy Research.
  20. Duggan, A., Windham, A., McFarlane, E., Fuddy, L., Rohde, C., Buchbinder, S., & Sia, C. (2000). Hawaii’s healthy start program of home visiting for at-risk families: Evaluation of family identification, family engagement, and service delivery. Pediatrics, 105, 250-259.
  21. DePanfilis, D., & Dubowitz, H. (2005). Family connections: A program for preventing child neglect. Child Maltreatment, 10, 108-123.
  22. Peterson, C. A., Luze, G. J., Eshbaugh, E. M., Jeon, H. J., & Kantz, K. R. (2007). Enhancing parent-child interactions through home visiting: Promising practice or unfulfilled promise? Journal of Early Intervention, 29, 119-140.
  23. Hebbeler, K. M., & Gerlach-Downie, S. G. (2002). Inside the black box of home visiting: A qualitative analysis of why intended outcomes were not achieved. Early Childhood Research Quarterly, 17, 28-51.
  24. Vogel, C. A., Boller, K., Xue, Y., Blair, R., Aikens, N., Burwick, A., & Stein, J. (2011). Learning as we go: A first snapshot of Early Head Start programs, staff, families, and children (OPRE Report #2011-7). Washington, DC: Department of Health and Human Services.
  25. Korfmacher, J., Green, B., Spellmann, M., & Thornburg, K. R. (2007). The helping relationship and program participation in early childhood home visiting. Infant Mental Health Journal, 28, 459-480.
  26. Korfmacher, J., Green, B., Staerkel, F., Peterson, C., Cook, G., Roggman, L., . . . Schiffman, R. (2008). Parent involvement in early childhood home visiting. Child Youth Care Forum, 37(4), 171-196.
  27. Peterson, C. A., Roggman, L. A., Staerkel, F., Cook, G., Jeon, H. J., & Thornburg, K. (2006). Understanding the dimensions of family involvement in home-based Early Head Start. Unpublished manuscript. Iowa State University, Ames, Iowa.
  28. Roggman, L. A., Christiansen, K., Cook, G. A., Jump, V. K., Boyce, L. K., & Peterson, C. A. (2006). Home visits: Measuring how they work. Logan, UT: Early Intervention Research Institute Mini-Conference.
  29. Aarons, G. A., & Palinkas, L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 411-419.
  30. Aarons, G. A., Sommerfeld, D., Hecht, D., Silovsky, J., & Chaffin, M. (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77, 270-280.
  31. Aarons, G. A., Fettes, D. L., Flores, L. E., Jr., & Sommerfeld, D. (2009). Evidence-based practice implementation and staff emotional exhaustion in children’s services. Behaviour Research and Therapy, 47, 954-960.
  32. Aarons, G. A., & Sawitzky, A. C. (2006). Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychological Services, 3, 61-72.
  33. Paulsell, D., Boller, K., Hallgren, K., & Esposito, A. M. (2010). Assessing home visit quality: Dosage, content, and relationships. Zero To Three, 30, 16-21.

How to cite this article:

Paulsell D. Replicating and Scaling Up Evidence-Based Home Visiting Programs: The Role of Implementation Research. In: Tremblay RE, Boivin M, Peters RDeV, eds. Spiker D, Gaylor E, topic eds. Encyclopedia on Early Childhood Development [online]. http://www.child-encyclopedia.com/home-visiting/according-experts/replicating-and-scaling-evidence-based-home-visiting-programs-role. Published September 2012. Accessed February 26, 2020.