After your district has identified local needs, it is time to determine which evidence-based strategies will best serve your student population. The chart below aligns to ESSA's four evidence categories and outlines the types of research studies completed on Savvas programs.
For a closer look at the four evidence categories in ESSA, view the Evidence-Based Requirements Explained tab above.
ESSA emphasizes "evidence-based" approaches that have demonstrated a statistically significant positive effect on student outcomes. ESSA identifies four levels of evidence: strong evidence, moderate evidence, promising evidence, and evidence that demonstrates a rationale. The levels do not correspond to the strength of student outcomes; rather, they define the criteria a study must meet.
What does evidence-based mean?
The first step in selecting an evidence-based intervention is to conduct a local needs assessment to: 1) identify local needs and/or root causes; and 2) assess the local capacity to implement the intervention.
The following questions, drawn from the Department of Education's guidance, can be used when reviewing a program's evidence.
No. Only Title I schools identified for comprehensive or targeted support and improvement must implement at least one intervention that is evidence-based. (E-26 of the January 2017 Title I Accountability FAQ)
However, reviewing a program’s evidence can help schools determine if an intervention is likely to be successful in improving student outcomes with their student population.
No. The requirement to purchase interventions based on strong, moderate, or promising evidence only applies to Section 1003(a) school improvement funding. (E-24 of the January 2017 Title I Accountability FAQ)
Research study results and evidence vary because researchers ask multiple questions. For example, in addition to "To what extent did student achievement increase?," we also want to know "To what extent was the program implemented well?" and "How can we isolate the effect of the program itself?" Another example question could be "To what extent did student achievement increase for different types of learners?"
In our research reports, we analyze the adjusted means of student assessment scores. That is, we introduce covariates (other factors that might influence a student’s performance) into our analyses. For example, a teacher who implements a program with high fidelity (implementing the program the way it was intended) will typically see significantly larger gains than a teacher who does not implement the program as designed.
By introducing covariates, we are able to isolate the effect of the program itself and determine whether it has a significant positive effect relative to the control or comparison group. By using this method of analysis (i.e., the use of covariates), these studies can meet ESSA's requirements for strong evidence. Other research groups, such as the Best Evidence Encyclopedia (BEE) and the What Works Clearinghouse (WWC), conduct analyses on unadjusted means (raw mean assessment scores) to determine whether a program has a significant positive effect relative to the control or comparison group.
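To make the adjusted-vs.-unadjusted distinction concrete, here is a minimal illustrative sketch in Python. It is not Savvas's actual analysis pipeline; the data, group names, and single covariate (a pretest score) are all made-up assumptions. It computes the classic ANCOVA adjusted mean, which shifts each group's posttest mean by the pooled within-group regression slope times the group's distance from the grand pretest mean.

```python
# Illustrative sketch only (hypothetical data, not Savvas's analysis):
# compare an unadjusted mean difference with ANCOVA-style adjusted means
# that control for one covariate, a pretest score.

def mean(xs):
    return sum(xs) / len(xs)

def within_group_slope(groups):
    """Pooled within-group regression slope of post-score on pre-score."""
    num = den = 0.0
    for pre, post in groups:
        mp, mq = mean(pre), mean(post)
        num += sum((x - mp) * (y - mq) for x, y in zip(pre, post))
        den += sum((x - mp) ** 2 for x in pre)
    return num / den

def adjusted_means(groups):
    """Classic ANCOVA adjustment:
    adj_mean = post_mean - b * (pre_mean - grand_pre_mean)."""
    b = within_group_slope(groups)
    all_pre = [x for pre, _ in groups for x in pre]
    grand = mean(all_pre)
    return [mean(post) - b * (mean(pre) - grand) for pre, post in groups]

# Hypothetical scores; the program group happened to start higher.
program = ([72, 75, 78, 80], [80, 83, 85, 88])  # (pretest, posttest)
control = ([60, 63, 66, 68], [66, 68, 71, 73])

unadjusted_gap = mean(program[1]) - mean(control[1])
adj_prog, adj_ctrl = adjusted_means([program, control])
adjusted_gap = adj_prog - adj_ctrl
# The adjusted gap is smaller than the raw gap: part of the raw
# difference is explained by the pretest covariate, not the program.
```

With these numbers the raw posttest gap is 14.5 points, but the covariate-adjusted gap shrinks to roughly 3.5 points, showing why adjusted and unadjusted analyses of the same study can report different effects.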
At Savvas, we’re committed to ensuring that our products and services deliver positive learner outcomes. Our Research Team conducts formative and summative research that directly informs the development of PreKindergarten through Grade 12 instructional programs; this includes third-party validation research. We work in collaboration with educators, students, authors, and developers to apply research-based principles to both product and user experience design. We measure a program's impact through scientific studies and trials to evaluate how well it meets the needs of users of all abilities and achievement levels.
At every stage of a product’s lifecycle, from initial concept to retirement, we embed efficacy and research activities. These activities help us understand, define, and demonstrate how a product impacts learner outcomes.
We’re rethinking education at every step. Help us design new educational products by participating in a research study. Let’s improve education together. Sign up now.