By Dennis A. Johnston, Ph.D., Director of Data and Evaluation, AVID Center
College readiness can be a challenge to measure, but at AVID Center, we take on this task every year through our certification process. The 11 AVID Essentials provide schools and districts with a barometer for measuring their success in providing a college readiness environment.
As AVID sites mature and progress on the certification continuum, they have the opportunity to apply to become AVID Demonstration sites, undergoing a rigorous application process and cooperating with AVID Center on a site visit. Over the past several years, I’ve had the opportunity to visit numerous AVID sites in pursuit of becoming a “National Demonstration School.” An AVID National Demonstration School is one representing a high-quality AVID Elective program and all the best that AVID brings to students, faculty, school culture, and community relations. Demonstration sites offer those considering AVID an opportunity to see what the program looks like in real classrooms with real kids. Sites striving to obtain Demonstration status undergo an extensive application and document review process culminating in a site visit by a team of program experts. This is done to ensure that these programs are implemented with a high level of fidelity to the AVID model. In other words, to be sure their AVID is our AVID.
Program implementation fidelity has been characterized as “…the degree to which teachers and other program providers implement programs as intended by the program developers” (Dusenbury, Brannigan, Falco, & Hansen, 2003). Researchers have found that program implementation fidelity affects participant outcomes such that higher levels of fidelity produce more positive outcomes (Thomas, Baker, & Lorenzetti, 2007; Noel, 2006). So what does this mean in the context of AVID? It suggests that when AVID is implemented consistently with the way it is trained at Summer Institute, student outcomes will be maximized and site-level transformation around college readiness can begin.
So how does one determine whether their AVID program is really AVID or if they themselves are simply avid about AVID? The Certification Self-Study (CSS) is a tool designed to measure the implementation fidelity of the AVID program. It focuses on the 11 Essentials (core components of the AVID program) and contains numerous implementation indicators measured across a four-point rating scale, from 0 (Not AVID) to 3 (Institutionalization). Annually, sites rate themselves on each of the indicators, resulting in scores on each of the Essentials that are then combined to produce an overall certification score. A score of 1 on each of the Essentials indicates an adequate level of implementation fidelity, rendering a site “Certified.” A “Certified” site, therefore, is one that is considered to have implemented AVID consistent with the AVID model and philosophy. Sites with scores of 2 or 3 on the Essentials are considered to have higher implementation fidelity and greater depth to their programs, and are generally touching more students and faculty on their campus.
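For readers who think in code, the classification logic described above can be sketched as a short function. This is purely illustrative: the 0–3 scale and the rule that a score of at least 1 on every Essential yields certification come from the description above, while the averaging of indicator ratings within an Essential and the labels used here are assumptions, not AVID Center’s actual scoring method.

```python
def essential_score(indicator_ratings):
    """Combine one Essential's indicator ratings (each 0-3) into a score.

    Averaging is an assumed aggregation method for illustration only.
    """
    return sum(indicator_ratings) / len(indicator_ratings)


def classify_site(essential_scores):
    """Classify a site from its 11 Essential scores (each 0-3).

    Per the article: a score of at least 1 on every Essential means
    "Certified"; any Essential below 1 means "Not AVID". Treating 2+
    across the board as higher-fidelity is an assumed label here.
    """
    if any(score < 1 for score in essential_scores):
        return "Not AVID"
    if all(score >= 2 for score in essential_scores):
        return "Higher fidelity"
    return "Certified"
```

For example, a site rating itself 1 on all 11 Essentials would classify as “Certified,” while a single Essential below 1 would classify the site as “Not AVID” regardless of its other scores.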
I’m sometimes asked, “Why do I have to complete the CSS each year if my students are successful and the program continues to grow?” Students succeed for a variety of reasons, and programs grow out of need more than anything else. When determining the effectiveness of any program, it is important that participant results can be attributed back to the program and that program growth is done with quality so as to maintain high levels of success. Research has shown that this is much more likely to occur when programs are implemented as intended, using clear metrics or measurements of implementation fidelity – in this case, the CSS.
A recent study (Johnston, Nickel, Popp, & Marcus, 2010) assessed the validity, or usefulness, of the CSS in determining the implementation fidelity of AVID programs across the country. Based on previous research, it was hypothesized that AVID sites identified as having the highest levels of fidelity would produce higher student outcomes than sites with lower fidelity. Data from 2,655 sites were obtained using the CSS during the 2008-09 academic year. Sites were classified into one of three groups based on CSS ratings: Not AVID, AVID Certified, or Demonstration. Results indicated that schools implementing AVID at the highest levels of fidelity (Demonstration) produced significantly higher student outcomes across all academic and course enrollment measures. Similarly, “Certified” sites outperformed sites considered “Not AVID” across all academic and course enrollment measures, further supporting the notion that higher implementation fidelity results in higher levels of program success. A brief synopsis of key performance indicators shows the degree to which AVID Demonstration sites outperform more basic sites. It was concluded that the CSS is a valid measure of program fidelity and, more importantly, that as programs go more deeply into their implementation, they can expect student outcomes to increase significantly.
This study, coupled with an extensive body of literature, should promote the practice of monitoring the implementation fidelity of educational programs. Certifying AVID sites based on the extent to which their program reflects the AVID model is much more about pedagogy than protocol. The CSS is designed to be prescriptive in assisting sites to move more deeply in their work and thus, increase their program fidelity and ultimately the success of their students. Asking oneself, “How AVID is our AVID?” is a great place to start.
Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256.
Johnston, D., Nickel, P., Popp, J., & Marcus, M. (2010). Validation of the AVID Certification Self-Study (CSS): A measure of AVID secondary program implementation fidelity. AVID Center.
Thomas, R. E., Baker, P. R. A., & Lorenzetti, D. (2007). Family-based programmes for preventing smoking by children and adolescents. Cochrane Database of Systematic Reviews, Issue 1, Art. No. CD004493. DOI: 10.1002/14651858.CD004493.pub2.