
Tuesday, November 24, 2015

Fix, Don’t Discard MCAS/PARCC



This fall I had one-on-one conversations with many of our state's leaders and experts about the misplaced opposition to testing in general, an opposition fueled by some real, legitimate mistakes in current public policy. Below is my analysis in the form of an argument.

Whereas:

  1. As enacted in 1993, the Massachusetts Comprehensive Assessment System (MCAS) was initially intended to support a comprehensive portfolio of integrated state and local assessment types. Whether the Board picks Measured Progress or a PARCC-developed test to serve as the grades 3-8 and grade 10 state summative assessment, the basic structure and name of MCAS remain the same. PARCC is a more modern, computer-delivered assessment. PARCC items are more challenging and better aligned to college and career readiness than MCAS items, incorporating the higher-order thinking skills that testing opponents purport to value.

  1. There is growing opposition to state summative assessments such as MCAS/PARCC, due in part to the fact that the current structure fails to meet the dual expectations of accountability and instruction. In addition, many parents and teachers correctly object to the loss of instructional time and to the unfair use of assessment results to denigrate schools and districts serving students in poverty.

  1. That said, state summative assessments such as MCAS and PARCC are essential: they underpin school and district accountability, focus school resources on student proficiency and growth, and establish the imperative for state involvement in districts like Lawrence and Holyoke. Elizabeth Warren, the Mass Business Alliance for Education, and the Rennie Center all agree that the MCAS accountability system must be improved, not discarded.

  1. Urban educators and families correctly object to the use of status data rather than growth data to hold schools and districts accountable. CPI and other metrics based on status (scaled score or percent proficient) unfairly bias public opinion and housing patterns away from schools that serve students in poverty, even though those schools are frequently better in terms of student growth percentiles (SGP) than schools serving fewer students in poverty. Professor Jack Schneider of Holy Cross has written about the negative impact this has on society and the importance of considering multiple factors before drawing valid insights from the data. Damian Betebenner, the originator of SGP, echoes this approach.

  1. While SGP is valid and essential for school and district accountability, it is mostly a distraction in teacher accountability. Fewer than one in five teachers teach a grade and subject for which SGP can be calculated; the SGP data are very “noisy” at that low N; and SGP fails to account for systemic supports (after-school tutors, etc.) that are critical but outside the teacher's direct control.

  1. MCAS does not deliver timely or instructionally significant results. Spring testing does little to prepare teachers for fall instruction. Summer mobility and cognitive regression mean that the students in front of a teacher in September are not well measured by last spring's exam. The item distribution is insufficient to create skill-level profiles for individual students, yet far richer than what sampling would require for school- and district-level results.

  1. The overreaction to MCAS has resulted in an extended test administration that typically runs for three weeks to deliver a test that should take one to two days, disrupting instruction as schools go into a “lockdown” mentality. Some schools overreact in test preparation as well, although the vast majority of that work is exactly what the most vulnerable students need.

Therefore: Part of the solution is to split MCAS into two distinct components. Whatever direction the state takes with regard to PARCC, the policies surrounding assessment need to acknowledge that the same test cannot practically be used for both accountability and instruction. The Legislature should direct DESE to implement a greatly abbreviated spring assessment used exclusively to generate SGP for school and district accountability, and a fall, locally administered, state-coordinated assessment designed to produce skill profiles for each student to inform instruction.

Students change over the summer. New kids enroll. Many kids regress; some accelerate. Differentiated instruction and personalized learning require a detailed understanding of each learner's mastery of generally accepted skills. The current MCAS and proposed PARCC assessment designs do not produce skill-level reports, nor is the information timely or complete for the students assigned to each teacher. Teachers need, and families deserve, educational support targeted at each child's “zone of proximal development,” not just the middle of the group. Embedded diagnostic assessment tools like Khan Academy, TenMarks, IXL, ALEKS, DreamBox, etc., provide teachers with the information they need to differentiate instruction without endless additional grading or pulling students away from time on task.

It is time for the Commonwealth to lead again. The current debate, between growing extremist movements on both the left and right and a moderate middle that wants to retain common-sense, high, and specific expectations for all students, is a destructive waste of energy. We are better than that. Again and again Massachusetts has led the nation in education innovation. We need to fix MCAS, reduce testing time, stop penalizing urban districts for serving poor students, and focus on giving teachers and families the tools they need to ensure that every child reads by third grade, every middle school graduate is competent in algebra and proficient in writing, and every high school student graduates with the core STEM, ELA, and life skills needed to be ready for higher education and careers. Let's end the destructive debate and get to work.

Thursday, November 12, 2015

AERA Growth for Teacher Eval

The American Educational Research Association is warning schools against using value-added scores when they make high-stakes decisions about teachers.

Value-added scores aim to measure the impact a teacher or teacher preparation program has on student achievement. But from a research perspective, it's very difficult to successfully isolate teachers and their training programs from the myriad other factors that play into how students perform on tests, AERA said in a new policy statement.

In fact, the conditions needed to make VAM scores accurate can't be met in many cases, according to the statement.


"This statement draws on the leading testing, statistical, and methodological expertise in the field of education research and related sciences, and on the highest standards that guide education research and its applications in policy and practice," said AERA Executive Director Felice J. Levine. 

Monday, November 2, 2015

Houston Badges

So You Want to Drive Instruction With Digital Badges? Start With the Teachers

Terry Grier
Oct 31, 2015
You can’t have a conversation about the future of public education these days without some mention of digital learning. And when you talk about digital learning, the discussion often turns to badging.

The concept is simple: individuals earn badges for demonstrating the acquisition of key knowledge and skills. Think Girl Scouts. When you marry the concept of badging with technology, you get digital badges that allow a person’s portfolio of badges to be stored in one place and provide a record of subject or skill mastery. This could have a significant impact on awarding credentials or certificates to students, and perhaps even creating an implementation framework for competency-based learning.

While badging for students shows real promise, a partnership between the Houston Independent School District (HISD) and VIF International Education demonstrates that in the short run the best approach to scaling digital badging is not to focus on students, but on their teachers.

Beginning this past fall, HISD launched a global learning initiative in 28 elementary schools. The district will expand the program to a total of 51 elementary schools for the 2015-16 school year. To ensure program quality for our students, we partnered with VIF to provide our teachers with globally themed online professional development and a customized digital badging system. Within the professional development platform, they also have access to curricular resources and a community of fellow educators to spur and support collaborative projects and innovative approaches. But the core of the system is the badging approach to professional development.

Participating teachers advance through a series of inquiry-based professional development modules. Teachers are awarded a digital badge for the successful completion of each 10-hour module. To accomplish this, they must complete the following steps: 1) study module content, 2) participate in a focused discussion with peers working on the same module, 3) create an original inquiry-based global lesson plan that incorporates new learning, 4) implement the original lesson plan in the classroom, 5) provide evidence of classroom implementation and 6) reflect on and revise the lesson created.

The final product of every module is a tested, global lesson plan that articulates learning objectives, activities, assessments, and resources for each stage of inquiry. Upon completion, teachers may publish finalized lessons in a resource library where they can be accessed by other educators. As designed, the HISD badging system will be a four-year, 16-badge approach that equates to 160 hours of professional learning for teachers.

Like other web-based professional development, the HISD badging system provides flexibility for HISD teachers to access the modules online at any time and place and to complete them at their own pace. This flexibility is critical to help teachers balance their everyday demands with the expectation to build new expertise in content, pedagogy and new technologies.

What makes the digital badging system different from more traditional forms of professional development are five key features that, taken together, significantly increase the likelihood that the learning experience for a teacher will lead to classroom results for students, which, after all, is the point of professional development. The five features:

  1. Badging requires demonstrating understanding and implementation of a target content or skill. Completing a module successfully requires more than just moving through the content. Teachers must learn it; confer with peers; develop, implement, and show evidence of a lesson plan using it; and reflect on the experience.
  2. Badging provides recognition and motivation. Badges are tangible, public symbols both of demonstrated learning and of the knowledge and skills a teacher has yet to develop. They create a recognizable pathway to demonstrating proficiency that teachers can understand and own.
  3. Badging allows for knowledge circulation among teachers. By requiring the development of lesson plans and evidence of implementation, digital badging systems create instructional materials that teachers can share and build on with each other. Digital badges accumulate in a teacher's online profile, can be shared via social media, and can be acknowledged by schools, districts, and states.
  4. Badging can be tracked and assessed. The HISD system provides tailored reports on teachers' progress through the badging process. This allows principals and district instructional support personnel not only to track the completion of badges and review developed materials, but also to assess the impact of the modules on teacher and student learning.
  5. Badging is a scalable enterprise. Once the modules and overall pathways are set, teachers can be added at whatever scale the district wants. The online platform scales to whatever number of teachers the district seeks to involve.
For teachers, digital badges could have value beyond their work in HISD. Badges allow them to build a portfolio that reflects the skills and knowledge they have developed, as well as evidence of classroom impact. That portfolio is portable: it remains with them whether they stay in the same school, move to another school within HISD, or move to another district altogether.

For school and district leaders, the badging system creates a platform for at least two future endeavors: first, personalizing professional development pathways with modules and badges that reflect an individual teacher's learning needs; second, developing a career advancement system based on demonstrated expertise through badging.

The HISD-VIF digital badging system for teachers offers a professional development experience that teachers have been seeking: one that is flexible, job-embedded, and collaborative, and provides actionable strategies for use in the classroom. It is like wheels on luggage. You are left wondering why it took so long to put this system in place. 

Dr. Terry Grier is the superintendent of the Houston Independent School District.