
Tuesday, November 24, 2015

Fix, Don’t Discard MCAS/PARCC



This fall I had one-on-one conversations with many of our state's leaders and experts about the misplaced opposition to testing in general, an opposition fueled by some legitimate mistakes in current public policy. Below is my analysis in the form of an argument.

Whereas:

  1. As enacted in 1993, the Massachusetts Comprehensive Assessment System (MCAS) was initially intended to support a comprehensive portfolio of integrated state and local assessment types. Whether the Board picks Measured Progress or a PARCC-developed test to serve as the G3-8 + G10 state summative assessment, the basic structure and name of MCAS remain the same. PARCC is a more modern, computer-delivered assessment. PARCC items are more challenging and better aligned to college and career readiness than MCAS items, incorporating the higher-order thinking skills that testing opponents purport to value.

  2. There is growing opposition to state summative assessments such as MCAS/PARCC, due in part to the fact that the current structure fails to meet the dual expectations of accountability and instruction. In addition, many parents and teachers correctly object to the loss of instructional time and to the unfair use of assessment results to denigrate schools and districts serving students in poverty.

  3. That said, state summative assessments such as MCAS and PARCC are essential to school and district accountability; they focus school resources on student proficiency and growth, and they establish the imperative for state involvement in districts like Lawrence and Holyoke. Elizabeth Warren, the Mass Business Alliance for Education, and the Rennie Center all agree that the MCAS-accountability system must be improved, not discarded.

  4. Urban educators and families correctly object to the use of status data instead of growth data to hold schools and districts accountable. CPI and other metrics that consider status (scaled score or percent proficient) unfairly bias public opinion and housing patterns against schools that serve students in poverty, even though those schools frequently outperform schools serving fewer students in poverty in terms of SGP. Professor Jack Schneider of the Holy Cross School of Education has written about the negative impact this has on society and the importance of weighing multiple factors before drawing valid insights from the data. Damian Betebenner, the originator of SGP, echoes this approach.

  5. While SGP is valid and essential in school and district accountability, it is mostly a distraction in teacher accountability. Fewer than 1 in 5 teachers can have SGP calculated for their grade and subject, the SGP data are very "noisy" at that low N, and the metric fails to account for systemic supports (after-school tutors, etc.) that are critical but outside the direct control of the teacher.

  6. MCAS does not deliver timely or instructionally significant results. Spring testing does little to prepare teachers for fall instruction. Summer mobility and cognitive regression mean that the students in front of a teacher are not well measured by last spring's exam. The item distribution is insufficient to create skill-level profiles for students, yet far more than sampling would require for school- and district-level results.

  7. The overreaction to MCAS has resulted in an extended test administration that typically runs for three weeks to administer a test that should take one to two days, disrupting instruction and pushing schools into a "lockdown" mentality. Some schools overreact in test preparation as well, although the vast majority of that work is exactly what the most vulnerable students need.

Therefore: Part of the solution is to split MCAS into two distinct components. Whatever direction the state takes with regard to PARCC, the policies surrounding assessment need to acknowledge that the same test cannot practically serve both accountability and instruction. The Legislature should direct DESE to implement a greatly abbreviated spring assessment used exclusively to generate SGP for school and district accountability, and a fall, locally administered, state-coordinated assessment designed to produce skill profiles for each student to inform instruction.

Students change over the summer. New kids enroll. Many kids regress; some accelerate. Differentiated instruction and personalized learning require a detailed understanding of each learner's mastery of generally accepted skills. The current MCAS and proposed PARCC assessment designs do not produce skill-level reports, nor is the information timely or complete for the students assigned to each teacher. Teachers need, and families deserve, educational support aimed at each child's "zone of proximal development," not just the middle of the group. Embedded diagnostic assessment tools like Khan Academy, TenMarks, IXL, ALEKS, DreamBox, etc., provide teachers with the information they need to differentiate instruction without endless additional grading or pulling students away from time-on-task learning.

It is time for the Commonwealth to lead again. The current debate between growing extremist movements on both the left and the right and a moderate middle that wants to retain commonsense, high, and specific expectations for all students is a destructive waste of energy. We are better than that. Again and again, Massachusetts has led the nation in education innovation. We need to fix MCAS, reduce testing time, stop penalizing urban districts for serving poor students, and focus on giving teachers and families the tools they need to ensure that every child reads by third grade, every middle school graduate is competent in algebra and proficient in writing, and every high school student graduates with the core STEM, ELA, and life skills needed to be ready for higher education and careers. Let's end the destructive debate and get to work.
