Among the many conundrums in children’s services that prevention science is struggling to overcome is this one: relatively few programs have passed any robust test of their effectiveness, and those that have are even less likely to be implemented than ones that can claim no empirical support at all.

Mulling over this disconcerting paradox, Brian Bumbarger and Daniel Perkins from the US Prevention Research Center at Penn State University have come to the conclusion that there is an urgent need for “interventions to enhance readiness for interventions”.

The wider problem of how to integrate programs that have demonstrated efficacy into mainstream policy and practice is an aspect of the bigger challenge facing what is being called Type 2 translation research. In particular, Bumbarger and Perkins consider the obstacles hindering implementation of a growing number of programs that have been shown to be effective at reducing drug use and delinquency among young people.

“First, these evidence-based programs are still under-utilized compared to prevention strategies with no empirical support. Second, when effective programs are used the evidence suggests they are not being implemented with quality and fidelity. Third, effective programs are often initiated with short-term grant funding, creating a challenge for sustainability beyond seed funding.”

In offering solutions, they draw on their experience of a large-scale dissemination effort in Pennsylvania. Since 1998, the Pennsylvania Commission on Crime and Delinquency (PCCD) has invested over $60 million in approximately 140 replications of various evidence-based interventions, from elementary school social-emotional learning programs to community mentoring and universal “family strengthening”.

Among their recommendations is, first, that policy makers should consider the question: “Is this program being planted in fertile ground?” In the PCCD initiative, few sites had thoughtfully considered this central issue of readiness.
This led to start-up problems such as poor buy-in from schools and teachers for classroom-based programs, difficulty hiring staff, under-budgeting for training, and struggles to recruit the target population.

It is therefore necessary, they argue, to address the challenges that practitioners face when implementing interventions – for example, by taking the time to identify community needs and to assess the stability of the implementing agency’s infrastructure and the skills and support of administrators and implementers. Hence the need, they say, for “interventions to enhance readiness for interventions”.

Secondly, Bumbarger and Perkins point to a need to support communities and schools through targeted training and continuous support to promote “higher quality and sustainability”. Research demonstrates unequivocally, they argue, that implementation quality affects outcomes. Yet in a 2007 survey of all the Pennsylvania sites funded since 1998, nearly 40% reported that they had not implemented the program as designed. Of these, 55% had made adaptations that could reasonably be expected to reduce program effectiveness by undermining the program’s theory of change. Examples included deleting lessons because of limited classroom time, and changing the target population in response to difficulty in recruiting families.

Bumbarger and Perkins contend that if we know absolute fidelity never occurs in natural conditions, arguing for it is futile. [See: You’re going to be unfaithful, so why not make it part of the service?] Instead, they advocate “adaptation with fidelity”. Pennsylvania therefore provides targeted technical assistance to each new grantee through the Prevention Research Center. “The conventional approach of ‘train them and send them on their way’ is ineffective in promoting high-quality implementation because there is generally little or no follow-up after training to assist implementers in navigating barriers.
Effective program adoption and implementation require initial training that is interactive and engaging, provides opportunities for behavioral rehearsal, and is followed up with ongoing coaching, technical assistance and support.”

Third, they say that “it is important to develop and pursue a concrete sustainability plan early on that goes beyond simply seeking the next temporary funding source”. Often there is funding to initiate programs, but it is left to the grantee to figure out how to sustain them. Consequently, many grant-funded prevention programs fizzle out after the initial funding period. This is misleading as well as unhelpful: during the start-up phase a program may not yet be functioning at a level that generates positive outcomes, and nipping potentially successful programs in the bud is likely to erode community trust and willingness to innovate.

Strategies implemented in Pennsylvania appear to have been effective at promoting sustainability for the PCCD-funded programs. In a 2007 survey of programs that had been off grant funding for two or more years, 76% reported that they were still operating. A requirement for escalating “local match” funding over the four-year grant period – grant support covering 100% of costs in years one and two, with communities contributing a 25% match in year three and a 50% match in year four – encouraged communities to identify sustainability funding sources early in the implementation process, while the gradual reduction in grant support allowed ample time to put the sustainability plan in place. Such plans should typically include attention to organizational infrastructure capacity and the nurturing of “champions” both inside and outside the implementing agency.