By Don Soifer
February 10, 2019
Opportunity Scholarship Students Post Important Academic Growth

Earlier this month, the Nevada Department of Education (NDE) published its first report assessing academic outcomes of students in the Nevada Opportunity Scholarship Program. The analysis showed encouraging results: 68.4 percent of students participating in the program for all three of its years were determined to show positive score change, and 65.7 percent of students in the program for two consecutive years also posted outcomes deemed positive by the NDE analysts.
It is important to observe that these results were measured on six different assessments administered to different students under the rules of the program. The law (AB165) which established the Opportunity Scholarship Program allowed participating private schools to select the test they used, and the dates it was given, from a list of norm-referenced assessments approved by the NDE.
Some lawmakers, and particularly Moises (Mo) Denis, chair of Nevada’s Senate Education Committee, have on numerous occasions expressed their intent to review student outcomes for the program as part of their oversight process.
So, how do these results compare with test scores for Nevada’s public school students? The answer, unfortunately, is complicated.
One major complicating factor is that the Smarter Balanced assessment, which Nevada’s public school students take, is a criterion-referenced assessment, in which a student’s performance is compared against a specific learning objective or achievement standard (e.g., state standards) rather than against the performance of other students. Norm-referenced assessments (the type used in this program), by contrast, compare students’ performance with that of a larger sample (the norm group), frequently a national sample representing a diverse cross-section of students.
Can results on these different tests be compared meaningfully? Yes, when the comparison uses valid methodologies. Outcomes on the different norm-referenced assessments can be usefully compared, especially when examining the growth of individual students over time, as the NDE report does.
In similar ways, it is even possible to compare these students’ growth with that of Nevada’s public school students on the state assessments. For the sake of comparison, between 2017 and 2018, Nevada’s public school students in grades 3-8 gained one percentage point of proficiency in English Language Arts, and 1.5 percentage points in grades 3-7 in math (eighth grade is excluded because curricular content changes in math produced test score anomalies).
Of course, the two student populations differ in essential ways: only lower-income students are eligible to participate in the Opportunity Scholarship Program, the NDE study included only students enrolled in the same school for consecutive years, and so on.
It is quite common among the nation’s top charter school oversight authorities to evaluate the effectiveness of schools in their purview using students’ longitudinal growth on these same norm-referenced assessments. This practice allows them to evaluate these schools’ performance with students over the time they are enrolled without penalizing schools that go out of their way to attract those previously underserved students who need them most.
These same norm-referenced assessments also hold important educational value for schools that use the results for individual students formatively, helping classroom teachers target specific instructional interventions and lessons. Some Nevada private schools give the assessments as many as three times each year for this purpose.
The NDE report met the legislature’s requirement for the program. In future years, if policymakers want to strengthen the evaluation, they would be well served to do what other states, including Florida, have done: assign modest funding to allow the state’s colleges of education to bid for the quantitative analysis work. In the hands of such experts, the norming and standardization scales used by each test publisher can be evaluated with methodologies that allow for broader, more robust, and more easily compared analyses.
Meanwhile, educators and families involved with the Opportunity Scholarship Program have good reason to express pride in these student outcomes, and hope that policymakers will accept them as strong indications of its educational value for the students it serves.