• By Dartington SRU
  • Posted on Wednesday 07th July, 2010

A problem as big as Alaska? The solution will take a minute

Fidelity: the quality of being faithful. It’s a critical word when it comes to evidence-based programs. It doesn’t matter how good your program is if service providers are unfaithful to it when it is rolled out into the community. Measuring fidelity is difficult enough in a tightly controlled trial. It is more difficult still out in the real world. Put that program in Alaska and you’ve entered a whole new universe of problems.

It started when Knowlton Johnson and his colleagues from the Pacific Institute for Research and Evaluation in Alaska realized that drug use was exploding among school-aged children in their state. They wanted to do something about it, but where to start? To begin with, there were no proven models designed for the Alaskan way of life. Johnson’s team works in what are known as ‘Frontier Communities’, few of which can be reached by road (which, unfortunately, had not been a deterrent to drug abuse). The next problem: even supposing they could find an effective prevention program, how would they monitor its implementation in each hard-to-reach locale?

Part of the solution arrived in the form of Think Smart, a school-based program built on Gil Botvin’s Life Skills Training (LST). Johnson’s team made a series of adaptations to match Think Smart to Alaska: they incorporated local images, removed passing references to gambling (a particular problem in Alaska) and included facts and figures about Alaskan life.

The next question was implementation. The team needed a system to check whether teachers taught the Think Smart curriculum as planned. The answer came through video and the easy availability of online cameras. Johnson and his team video-recorded teachers while they delivered the program. They measured dosage and adherence, counting the number of times teachers changed, added or omitted parts of the prevention curriculum. They also rated teachers’ skills in delivery, tracking, for example, the amount of praise they gave to students.

In an article in the current edition of Prevention Science, Johnson and his team describe how they assembled an expert panel to assess the video recordings. The experts worked independently of each other, and the team tested the reliability of the raters, making sure their individual conclusions were consistent with one another. In this way, they came up with a statewide way to track how well their prevention program was being delivered. Their Alaskan experience may provide a good road map for others who hope to track the fidelity of programs in large systems.
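The article does not reproduce the panel’s statistics, but the idea of checking that independent raters reach consistent conclusions can be made concrete. Below is a minimal Python sketch of Cohen’s kappa, one common chance-corrected agreement statistic for two raters; the scores and the three-point adherence scale are invented for illustration, and the study’s actual reliability methods may well differ.

```python
# Hypothetical sketch: inter-rater agreement on fidelity ratings.
# Two (invented) raters score the same video-recorded lesson segments
# on a 3-point adherence scale: 0 = omitted, 1 = modified, 2 = as designed.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: share of segments given the same score.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented example scores for ten lesson segments.
rater_a = [2, 2, 1, 2, 0, 2, 1, 2, 2, 1]
rater_b = [2, 2, 1, 1, 0, 2, 1, 2, 2, 2]
print(f"kappa = {cohens_kappa(rater_a, rater_b):.2f}")  # prints kappa = 0.63
```

Values near 1 indicate that raters agree far more often than chance would predict; values near 0 suggest the rating scheme or rater training needs work before the panel’s judgments can be trusted at scale.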
References

Knowlton Johnson and colleagues, ‘Studying implementation quality of a school-based prevention curriculum in frontier Alaska: Application of video-recorded observations and expert panel judgment’, Prevention Science.
