NC Science End-Of-Grade Results Suggest Manipulation; Inflated School Performance Grades

 


Wilmington, NC – In early September, the North Carolina Department of Public Instruction (NC DPI) released the state's test scores for the 2015–2016 school year. The state's end-of-grade (EOG) and end-of-course (EOC) proficiency scores have remained stubbornly low, even as the state enters its fifth year of using the Common Core State Standards.

NC DPI and the State Board of Education (SBOE) have been under fire since the 2010 adoption of the Common Core State Standards, which were supposed to improve statewide student achievement. NC DPI insists that Common Core is the goose that lays golden eggs, yet all it seems to have produced thus far is failure. Even with this year's marginal gains in math and reading, almost half of the students in the state still don't meet basic proficiency levels four years after the standards were implemented.

When testing data is released, education leaders tend to focus on math and reading scores. Thanks to Carol Kramer, a savvy parent in New Hanover County, this year’s science EOGs are getting some well-deserved scrutiny.

It is well-known in academia that students cannot perform satisfactorily on a science EOG without strong reading and math skills. Historically speaking, the higher the math and reading proficiency, the higher the science score. When Ms. Kramer brought to my attention the fact that a low-performing middle school in my district had reported a 64.7% science proficiency while reporting 13.4% math and 26.9% reading scores, I was stunned. Even under the most ideal circumstances, it would be nearly impossible to produce this kind of result for an entire 8th grade class.

Let me rephrase that: if the science standards are as academically challenging as math and reading, it is not possible.

Upon further analysis, I found that almost every New Hanover County school reported higher 5th and 8th grade science EOG scores than math and reading scores. Although the gaps varied by school, the most pronounced differences occurred in some of the lowest performing schools. The students with the worst math and reading proficiency scores had some of the highest ratios of science to math and reading.

I turned my attention to the statewide data to determine if this was a district reporting error, but I found the same anomalies in every district I examined. Even North Carolina's aggregated scores reflect this irregularity, which I've broken down as follows:

NORTH CAROLINA 2015–2016 AGGREGATED SCORES BY SUBJECT

Subject     5th Grade Level 3*   5th Grade CCR**   8th Grade Level 3   8th Grade CCR
Science     71.6%                61.8%             73.9%               64.5%
Math        60.4%                54.0%             44.7%               38.5%
Reading     55.4%                43.1%             53.4%               41.5%

*Level 3 is minimum proficiency and above. **College and Career Ready proficiency includes students who scored a Level 4 or 5.

Note the double-digit differences – in some cases more than 20 percentage points – between the science EOG scores and the math and reading results. The 8th grade numbers are particularly troublesome.

It’s hard to believe that barely half of the 8th grade students in North Carolina are considered proficient (Level 3) in math and reading, yet almost 74% of them passed a science EOG.

This raises the question: Is testing data being manipulated to make failure look like success?

When situations like these occur, there can be a misalignment between the test and the standards used to create the curriculum. However, in 2014, NC DPI commissioned the Wisconsin Center for Education Research to perform a standards-alignment analysis of the state's math, reading, and science EOG tests. The center concluded its analysis in September 2015 and submitted its report to NC DPI. It determined that the tests are properly aligned to the standards.

According to NC DPI, the state's science standards are rigorous and up to date. In fact, in the 2009–2010 school year, the SBOE announced plans to completely overhaul the state's K-12 academic standards. North Carolina's science standards fall under the NC Essential Standards.

Upon the June 2010 adoption of the Common Core math and reading standards, State Superintendent June Atkinson stated the following in a press release: “North Carolina’s own essential standards are well aligned with the math and English Common Core, and we look forward to the benefits these new standards will bring for our students.”

In a 2012 follow-up press release entitled New Common Core and Essential Standards Align Teaching and Learning to Career and College Readiness, NC DPI proudly touts the alignment of Common Core and the NC Essential Standards, stating: “As North Carolina public school students return to school for the 2012-13 year, teachers are preparing new lessons and new teaching strategies to match new teaching and learning standards in every grade and every subject. This is the first time that North Carolina has implemented a completely revised Standard Course of Study in all areas and grades at once.”

Assuming the science standards match the rigor and 21st century critical thinking demands that the Common Core math and reading standards promised, they should be more challenging than what we had previously. Had any substantive changes been made to the science standards since 2012, NC DPI would most likely have suspended testing, or at the very least, withheld the data as it would not have been reliable.

After combing through an absurd amount of information, I found that the explanation must lie in the tests themselves – either the cut scores aren't weighted accurately, the test questions aren't demanding enough, or both.

Cut scores are the points at which an education agency sets the cutoffs for proficiency levels. The State Board of Education approved a new proficiency scale (moving from a four-point to a five-point scale) and new achievement levels for EOG/EOC assessments in October 2013.

At that time, the SBOE was given four cut score options along with the projected impact each option would have on student proficiency. Several of the options were weighted to produce higher proficiency rates. The SBOE approved “Option 1” – the most balanced approach, which also projected realistic proficiency gains – as NC DPI recommended for the state EOG/EOC assessments.

It is not known whether NC DPI has changed the science cut scores since then, but the data below show that science proficiency rates are far outpacing those initial projections, while math and reading appear to be on target.

When the science EOGs are compared to the math and reading scores over the past three years, the disparities become even clearer. Conventionally, gains in math and reading should bring comparable improvements in science, given the distinct relationship between the subjects. That is clearly not the case here.

GRADE LEVEL PROFICIENCY (LEVEL 3 AND ABOVE)

5th Grade   2013/2014   2014/2015   2015/2016
Science     64.2%       64.6%       71.6%
Math        56.4%       57.5%       60.4%
Reading     53.8%       53.0%       55.4%
Variance*   7.8/10.4    7.1/11.6    11.2/16.2

8th Grade   2013/2014   2014/2015   2015/2016
Science     71.4%       72.6%       73.9%
Math        42.2%       43.2%       44.7%
Reading     54.2%       53.4%       53.4%
Variance*   29.2/17.2   29.4/19.2   29.2/20.5

*Variance is the point spread between the science percentage and the math/reading percentages. It is calculated by subtracting each math and reading proficiency percentage from the corresponding science percentage. For example, 7.8/10.4 is the result of 64.2 – 56.4 = 7.8 and 64.2 – 53.8 = 10.4.
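The variance arithmetic in that footnote is simple enough to check by hand, but a few lines of Python make the pattern easy to reproduce. This is just a sketch: the numbers are copied from the 5th grade Level 3 table above, and the function name is mine.

```python
# Level 3 (grade-level) proficiency percentages for 5th grade,
# in year order: 2013/2014, 2014/2015, 2015/2016.
scores_5th = {
    "science": [64.2, 64.6, 71.6],
    "math":    [56.4, 57.5, 60.4],
    "reading": [53.8, 53.0, 55.4],
}

def variance(scores):
    """Point spread between science and math/reading for each year:
    (science - math, science - reading)."""
    return [(round(s - m, 1), round(s - r, 1))
            for s, m, r in zip(scores["science"],
                               scores["math"],
                               scores["reading"])]

print(variance(scores_5th))  # [(7.8, 10.4), (7.1, 11.6), (11.2, 16.2)]
```

Running the same calculation on the 8th grade columns reproduces the 29-point science-to-math spreads shown in the table.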

COLLEGE AND CAREER READY PROFICIENCY (LEVEL 4 AND 5)

5th Grade   2013/2014   2014/2015   2015/2016
Science     52.6%       54.1%       61.8%
Math        50.3%       51.3%       54.0%
Reading     40.3%       42.2%       43.1%
Variance*   2.3/12.3    2.8/11.9    7.8/18.7

8th Grade   2013/2014   2014/2015   2015/2016
Science     61.9%       63.7%       64.5%
Math        34.6%       35.8%       38.5%
Reading     42.3%       41.6%       41.5%
Variance*   27.3/19.6   27.9/22.1   26.0/23.0

*Variance is the point spread between the science percentage and the math/reading percentages. It is calculated by subtracting each math and reading proficiency percentage from the corresponding science percentage. For example, 2.3/12.3 is the result of 52.6 – 50.3 = 2.3 and 52.6 – 40.3 = 12.3.

Even though gains in math and reading are marginally ticking upward – with the exception of 8th grade reading, where students have lost ground – significant gains in science proficiency scores have opened up sizeable gaps each year.

This completely flies in the face of the gold standard of assessment reporting, the National Assessment of Educational Progress (NAEP), which reported the 2011 8th grade minimum proficiency scores as follows: Math = 37%, Reading = 31%, Science = 26%.

Just prior to the SBOE's adoption of the new proficiency scale and achievement level cut scores, the North Carolina General Assembly passed accountability measures in an effort to hold the state's public schools responsible for student achievement. Under General Statute 115C-83.15, each school would now be graded on an A-F scale based on test scores and value-added growth measurements.

The timing here is interesting.

In the table below, taken from the 2015–2016 Performance and Growth of North Carolina Public Schools Executive Summary, NC DPI is smugly displaying how well North Carolina's public schools are doing based on the number of schools moving into the higher performance levels during this same three-year period.

[Table: school performance grade distribution by year, from the 2015–2016 Executive Summary]

Since the 2013/2014 school year:

  • 84 schools are now categorized as having an A+ rating
  • 46 fewer schools are categorized as having an A rating
  • 52 additional schools have earned a B rating
  • 81 additional schools have earned a C rating
  • 92 fewer schools are categorized as having a D rating
  • 44 fewer schools are categorized as having an F rating

It must be noted that this data includes public charter schools and that additional schools have come online in this timeframe. The A+ rating was also not available as a performance measure in 2013. Regardless, this is significant movement over a three-year period.

Large upward swings in math and reading scores would be difficult to justify because that is where most of the scrutiny tends to be focused. The science proficiency scores, however, have largely slipped under the radar.

It would be advantageous for NC DPI to have the highest proficiency rates possible due to the way school achievement scores are calculated. According to general statute, one point for each percent of students who score at or above proficiency is allotted for each indicator measured.
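To see why one inflated indicator matters, consider a minimal sketch of that points-per-percent rule. The indicator names and values below are hypothetical (loosely modeled on the statewide 8th grade figures), and treating the achievement score as a simple average of the indicator percentages is my simplification for illustration:

```python
# Hypothetical indicator percentages for one school. Under the statute's
# rule, each indicator contributes one point per percent of students at or
# above proficiency, so here the achievement score behaves like an average
# of the indicator percentages.
indicators = {
    "math_eog":       44.7,  # EOG proficiency
    "reading_eog":    53.4,  # EOG proficiency
    "science_eog":    73.9,  # EOG proficiency
    "math_iii_rigor": 95.2,  # course-completion based, not test based
}

def achievement_score(inds):
    """Average of indicator percentages (one point per percent)."""
    return round(sum(inds.values()) / len(inds), 1)

with_science = achievement_score(indicators)
without_science = achievement_score(
    {k: v for k, v in indicators.items() if k != "science_eog"})
print(with_science, without_science)
```

With these made-up numbers, including the science indicator lifts the score by more than two points over the same school scored without it, which is the mechanism the gaps above would exploit.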

This could explain the gaps between science and math/reading proficiency scores.

[Table: indicators used in school achievement score calculations]

Several of the other indicators used in school achievement score calculations above are quite subjective and don’t provide an accurate picture of school performance, but they do artificially inflate school achievement scores.

The 4-year graduation rate, which is currently 85.8%, is a prime example. Most reasonable people would have difficulty squaring an overall proficiency score of 58% with an 85% graduation rate (Related story: The Actual Graduation Rate in NC isn’t 85.8%).

Math III “course rigor” is another problematic data point, as it is based on the number of students who complete the course with a passing grade – in other words, it is not based on a standardized test score. For each of the last three years, NC DPI has reported this percentage as greater than 95%.

There are other EOCs subject to NC DPI's discretionary cutoff levels that also factor into the school achievement score – a score that begins to fall apart under closer inspection.

The North Carolina Department of Public Instruction is staffed with highly educated, well-paid professionals whose job is to collect, analyze and present this information to the local school districts they oversee and to the public. They also use this data to help the SBOE make important decisions regarding state education policy. It’s hard to believe they failed to recognize this kind of breakdown in their own testing accountability program.

North Carolina’s teachers rely heavily on testing information to tailor instruction. They must be able to trust the information coming from their superiors or there is no point in wasting their time collecting it.

But the effect of low expectations on students is much more disturbing. Intentionally lowering the bar to make failure look like success ultimately hurts the most disadvantaged among us, particularly those in high-poverty, low-performing schools.

If our education leaders are using their positions to manipulate test results in order to circumvent efforts to hold them accountable, can any of the data they release be trusted?
