Three Reasons Why The UO Recommended Benchmark Goals Are Necessary

Questions & Answers

Q1: What external measure and percentile do the DIBELS Next Recommended Goals correspond to?

The external criterion measure for the DIBELS Next Recommended Benchmark Goals for all grades is the Stanford Achievement Test—10th edition (SAT10; Pearson Education, Inc., 2004, 2007 Normative Update) Total Reading Composite score, administered at the end of the school year. The benchmark goal is the DIBELS score that most closely predicts scoring at or above the 40th percentile on the SAT10. Students with scores in the strategic range are predicted to score between the 20th and the 39th percentile on the SAT10. Students with scores in the intensive range are predicted to score below the 20th percentile on the SAT10.
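The percentile bands above amount to a simple decision rule. As a minimal sketch (a hypothetical helper, not part of the DIBELS Next materials), the mapping from a student's predicted SAT10 percentile to a support range looks like this:

```python
def support_range(predicted_percentile: float) -> str:
    """Map a predicted SAT10 Total Reading percentile to the support
    range described above. Illustrative only, not a DIBELS tool."""
    if predicted_percentile >= 40:
        return "benchmark"   # at or above the 40th percentile
    elif predicted_percentile >= 20:
        return "strategic"   # 20th through 39th percentile
    else:
        return "intensive"   # below the 20th percentile

print(support_range(45))  # benchmark
print(support_range(25))  # strategic
print(support_range(10))  # intensive
```

In practice the published goals express this rule in DIBELS score units: the benchmark goal is the DIBELS score that best predicts landing in the top band.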

Q2: How do you recommend we measure comprehension?

Oral Reading Fluency is a good indicator of both fluency and comprehension and is sufficient for assessing most students. We do not endorse using Retell Fluency for all students, but it can be used with any student for whom you would like additional information about comprehension of the passages they read. In grades 3 through 6, we endorse the Daze measure for comprehension.

Q3: If the sample of students used to determine the former goals was higher achieving, shouldn’t the former goals have been higher than the recommended goals?

Not necessarily; the answer depends on the method used to choose the goals. If only percentile ranks from the former-goals sample had been used to select the benchmark goals, then yes, the former goals may have been higher than the recommended goals. For example, if one chose to place the goal at the 40th percentile of students in each sample, then the higher-achieving sample would produce a higher goal.

The analysis used to produce the former goals, however, relied primarily on the negative predictive value (NPV), along with several other design specifications. NPV is markedly affected by the prevalence of reading difficulty in the sample. In a sample with a low prevalence of reading difficulty, as in the former goals analysis, we would expect NPV to be higher, which would indicate that a "good" goal corresponds to a relatively low DIBELS score (see UO-CTL (2012) Technical Brief Part II, pp. 5-6, for more information).
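For readers who want to see the arithmetic, the sketch below (with illustrative counts, not drawn from any DIBELS study) shows how NPV shifts with prevalence even when the screener's accuracy is held fixed:

```python
def npv(tn: float, fn: float) -> float:
    """Negative predictive value: of students the screener labels
    'not at risk', the proportion who truly are not at risk."""
    return tn / (tn + fn)

def confusion(n: int, prevalence: float,
              sensitivity: float, specificity: float):
    """Expected confusion-matrix counts for a screener with the given
    accuracy applied to a sample with the given prevalence."""
    pos = n * prevalence        # students truly at risk
    neg = n - pos               # students truly not at risk
    tp = sensitivity * pos      # at-risk students correctly flagged
    fn = pos - tp               # at-risk students missed
    tn = specificity * neg      # not-at-risk students correctly cleared
    fp = neg - tn               # not-at-risk students falsely flagged
    return tp, fp, tn, fn

# Same screener (sensitivity 0.80, specificity 0.90), two samples that
# differ only in how common reading difficulty is.
for prev in (0.10, 0.40):
    tp, fp, tn, fn = confusion(1000, prev, 0.80, 0.90)
    print(f"prevalence {prev:.0%}: NPV = {npv(tn, fn):.3f}")
```

With 10% prevalence the NPV comes out near 0.98; with 40% prevalence it drops to about 0.87, even though the screener itself has not changed. A goal chosen to maximize NPV in a low-prevalence sample can therefore sit at a lower DIBELS score than the same method would produce in a more representative sample.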

Furthermore, it is not at all clear how the other design specifications were incorporated into the selection of the former goals. One of the biggest issues with the former goals analysis is that the goals were not linked to a common standard—they were linked to performance on subsequent DIBELS Next measures or to the DIBELS Next composite score. Hence, the value of the former goals was determined in part by their performance relative to a subsequent DIBELS Next administration (e.g., fall goals were determined by winter goals) rather than by an external criterion measure. This approach creates an artificial dependence between the screener and the criterion, and can make the goals appear spuriously more valid than they otherwise might be. It also makes the goals unpredictable, because each goal inherits the properties of whatever later administration it was anchored to.

The recommended benchmark goals were established using sensitivity and specificity as the primary statistics. These statistics do not depend on whether the sample included high- or low-performing students; they depend only on the overall accuracy of the screener as it relates to a criterion. Nonetheless, a diverse and representative sample of students is valuable for other reasons. The selection of the recommended benchmark goals, consistent with standards for test development, relied on a representative sample from diverse communities in all nine census regions of the U.S. to ensure that students from many socio-economic and ethnic backgrounds were represented.
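The contrast with NPV can be made concrete. In the sketch below (illustrative counts, not DIBELS data), the share of at-risk students quadruples between two samples, yet sensitivity and specificity are unchanged, because each is computed within one true-status group:

```python
def sensitivity(tp: float, fn: float) -> float:
    """Of students truly at risk, the proportion the screener flags."""
    return tp / (tp + fn)

def specificity(tn: float, fp: float) -> float:
    """Of students truly not at risk, the proportion the screener clears."""
    return tn / (tn + fp)

# Same screener applied to a low- and a high-prevalence sample.
# Changing prevalence rescales each true-status group as a whole,
# so the within-group ratios stay fixed even though NPV would shift.
low  = dict(tp=80,  fn=20, tn=810, fp=90)   # 10% truly at risk
high = dict(tp=320, fn=80, tn=540, fp=60)   # 40% truly at risk

for name, m in (("low prevalence", low), ("high prevalence", high)):
    print(name,
          round(sensitivity(m["tp"], m["fn"]), 2),
          round(specificity(m["tn"], m["fp"]), 2))
```

Both samples print sensitivity 0.8 and specificity 0.9, which is why goals chosen on these statistics generalize across samples with different achievement levels.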

Q4: How can I look at growth when I used the former goals last year and I am using the recommended goals this year?

You can look at growth in several ways. If you are using the DIBELS Data System, you can run your 2011-2012 reports using the recommended goals and compare them to your 2012-2013 reports. You can also look at the raw scores from year to year to see the growth for individual students or look at changes in group-average scores for students in a school or district to look at school or classroom level growth. It is important to remember that an increase in the goals does not mean there is a decrease in student performance.

Q5: Do the recommended goals use the same DIBELS Next materials as the former goals?

Yes, there is currently one set of DIBELS Next assessment materials that is used with both sets of goals. Note that administration of some of the DIBELS Next measures (e.g., Retell) is optional when using the recommended goals.

Q6: Will the DIBELS Data System continue to provide reports with the former goals next year?

Yes, we plan to continue support of the former goals for the foreseeable future. We think it is valuable for schools to have access to both sets of goals so that they can see how each set of goals functions when making decisions about their own students.

Although we do suggest transitioning to the recommended goals, we understand the need to support both sets of goals and will continue to provide you with that option.

Q7: Can I see how the recommended goals predict my state outcome measure?

Yes. If you are using the DIBELS Data System you can enter scores from a state-level or other outcome measure of your choice and see how well the DIBELS Next recommended goals predict performance on the outcome measure for your district or school. You can also run a report showing you how well the former goals predict performance on the same outcome measure. This ability provides you with valuable information that is specific to your population. Additionally, you can participate with us as a Sentinel School, and we will partner with you to conduct this data analysis.

Please contact us for more information.