
Showing posts with label Assessment.

Wednesday, February 3, 2016

Fordham Institute ESSA Accountability Design Competition Finalists


Tuesday, December 15, 2015

Competency-Based Progression

Sanborn Regional High School in Kingston, NH

1.       We believe that all students can and must learn. In each of our courses, our competencies include explicit, measurable, transferable learning objectives that empower our students. They address both the application and creation of knowledge and the development of work study practices.

2.       We believe that all teachers must approach grading in the same manner. Grades represent what students learn, not what they earn. We use a four-point letter rubric scale to report both assignment and competency levels of achievement. Numerical “grades” are used only to report final overall course grades so we can compute class rank and GPA for college application purposes. We do not mix academic grades with behavior grades.

3.       We believe that the most significant learning takes place for our students through reflection and reassessment. Our students use the feedback they receive from rubrics to help them understand how to improve their learning.

4.       We believe that our teachers are most effective when they work in teams. We use the Professional Learning Community (PLC) structure to focus our teams on student learning. Over the years, we have found ways to maximize the time allotted for our teachers to collaborate with their PLCs, and this time is available to our teachers every day.

5.       We believe that assessment is meaningful and a positive learning experience for students. Our teachers focus on providing quality aligned instruction and performance assessment practices that are tuned to standards, providing students with multiple opportunities to demonstrate mastery.

6.       We believe that all students must receive timely, differentiated support based on their individual learning needs. We recognize that this support cannot always be embedded within the instructional time, and therefore our school has developed a structure to provide this support school-wide at a dedicated time each school day.

7.       We believe that there are many ways for our students to demonstrate mastery of competencies and thus earn credit for their graduation requirements. At our school, we have expanded credit-bearing opportunities far beyond simple traditional classroom courses. Through these alternative pathways, we have started to create a system whereby our students can advance upon demonstrated mastery.

8.       We believe that competency education is rigorous. Rigor is not defined by how much work we assign our students. It is defined by how deeply we engage them in their thinking, understanding, application, and extension of the skills and concepts presented to them through their coursework. We tune our instruction and assessment to the work of Hess’s Rigor Matrix.

9.       We believe that our school’s competency education philosophy aligns perfectly with the competency based systems that colleges and universities are moving to. To that end, we believe that a competency education model is the best way to prepare our students for college and career.

10.   We believe that competency education is ultimately transformed not by the way we report grades or how we build assessments but rather by how we approach instruction in the classroom. Our classroom teachers recognize that quality instruction engages all learners each and every day.


This article was originally written for Competency Works

How My Understanding of Competency Based Education Has Changed Over the Years
by Brian Stack • December 14, 2015

Each day as I interact with our teachers and our students, I am reminded of the extent to which our decision to move to a competency based model has positively influenced our school’s culture and climate, and our philosophy about learning. Today we are graduating students who have never known any other educational system than the one I described above. We spend a great deal of time with our new staff each fall indoctrinating them with our beliefs about teaching and learning. Each day I see small victories from our work, ranging from students who are being held to higher standards to teacher teams who continue to advance their own understanding and application of the competency education philosophy. I challenge you to ask any of my teachers if they could ever go back to a traditional mindset, and I can assure you that you won’t find one who would. We have truly transformed our professional culture into one focused on student learning.

Next week, I am excited to be sharing the work that my team and I have done in New Hampshire on competency based education with a group of South Carolina educators as part of the Transform SC institute on Meeting the Needs of Every Student With Competency Based Progression. My preparation for this institute has been an opportunity for me to reflect on what has now been a six-year journey with competency education at Sanborn Regional High School in Kingston, NH. This past week, our school district was recognized for the second year in a row as a “leader in competency education” by Tom Vander Ark’s organization Getting Smart, which noted that Sanborn was one of 30 School Districts Worth Visiting in 2015.

Throughout my journey as a building principal navigating the uncharted waters of a new competency education model, I have shared my thoughts, my reflections, and my research through articles on Competency Works. It has been three years since I wrote one of my first articles, entitled Five Things That Changed At My School When We Adopted Competencies. I am often asked how my views of competency education have evolved during my tenure at Sanborn. In that 2012 article, I talked about how our school community decided to “jump into the deep end of the pool” of high school redesign in an effort to provide a better learning experience for our students with a new competency based education model. I noted some big changes for our school community that, at the time, was in its second year of implementing a competency education model adopted by our entire K-12 district. We were a school that was still very much in transition from an old way of thinking to a new one. We were leveraging our grading and reporting structures to ultimately help us transform instruction at the classroom level. Over the years, our understanding of competency education has deepened. We continue to learn more about ourselves each day through our work with our students and with each other as professionals. When visitors come to our school and talk with our teachers and our students, here is what they often tell me they take away from their visit.




Tuesday, November 24, 2015

Fix, Don’t Discard MCAS/PARCC



This fall I had one-on-one conversations with many of our state's leaders and experts about the misplaced opposition to testing in general, a reaction caused by some real and legitimate mistakes in current public policy. Below is my analysis in the form of an argument.

Whereas:

  1. As enacted in 1993, the Massachusetts Comprehensive Assessment System (MCAS) was initially intended to support a comprehensive portfolio of integrated state and local assessment types. Whether the Board picks Measured Progress or a PARCC-developed test to serve as the G3-8 + G10 state summative assessment, the basic structure and name of MCAS remain the same. PARCC is a more modern, computer-delivered assessment. PARCC items are more challenging and more college- and career-aligned than MCAS items, incorporating the higher-order thinking skills that testing opponents purport to value.

  2. There is growing opposition to state summative assessments such as MCAS/PARCC, due in part to the fact that the current structure fails to meet the dual expectations of accountability and instruction. In addition, many parents and teachers correctly object to the loss of instructional time and the unfair use of assessment results to put down schools and districts serving students in poverty.

  3. With that said, state summative assessments such as MCAS and PARCC are essential to school and district accountability; focus school resources on student proficiency and growth; and establish the imperative for state involvement in districts like Lawrence and Holyoke. Elizabeth Warren, the Mass Business Alliance for Education, and the Rennie Center all agree that the MCAS accountability system must be improved, not discarded.

  4. Urban educators and families correctly object to the use of status data instead of growth data to hold schools and districts accountable. CPI and other metrics that consider status (scaled score or % proficient) unfairly bias public opinion and housing patterns away from schools that serve students in poverty, although those schools are frequently better in terms of student growth percentile (SGP) than schools serving fewer students in poverty. Professor Jack Schneider of Holy Cross has written about the negative impact this has on society and the importance of utilizing multiple factors before drawing valid insights from the data. Damian Betebenner, the originator of SGP, echoes this approach.

  5. While SGP is valid and essential for school and district accountability, it is mostly a distraction in teacher accountability. Fewer than 1 in 5 teachers can have SGP calculated for their grade and subject; the SGP data is very “noisy” at that low N; and it fails to account for the systemic supports (after-school tutors, etc.) that are critical but outside the direct control of the teacher.

  6. MCAS does not deliver timely or instructionally significant results. Spring testing does little to prepare teachers for fall instruction. Summer mobility and cognitive regression mean that the students in front of a teacher are not well measured by last spring’s exam. The item distribution is insufficient to create skill-level profiles for students, yet far more than sampling would require for school- and district-level results.

  7. The overreaction to MCAS has resulted in an extended test administration that typically runs for three weeks to administer a test that should take one to two days, disrupting instruction as schools go into a “lockdown” mentality. Some schools overreact in test preparation as well, although the vast majority of that work is exactly what the most vulnerable students need.
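The “noisy at that low N” claim about teacher-level SGP can be illustrated with a quick simulation. This is a rough sketch, not actual SGP methodology: real growth percentiles come from a conditional-percentile regression, while here each student's percentile is simply modeled as an independent uniform draw from 1-99, which is enough to show how sample size alone drives the noise.

```python
import random
import statistics

random.seed(0)

def mean_sgp(n_students):
    # Hypothetical model: each student's growth percentile is an
    # independent uniform draw from 1-99. A class or school mean
    # is the statistic an accountability system would actually use.
    return statistics.mean(random.randint(1, 99) for _ in range(n_students))

def spread(n_students, trials=2000):
    # Standard deviation of the mean SGP across repeated simulated
    # cohorts: a rough proxy for year-to-year noise at that N.
    return statistics.stdev(mean_sgp(n_students) for _ in range(trials))

for n in (25, 100, 400):  # one class, one grade cohort, one school
    print(f"N={n}: noise in mean SGP is roughly {spread(n):.1f} percentile points")
```

Under these assumptions, the swing in a single teacher's mean SGP from cohort to cohort is several times larger than the swing at the school level, which is the argument for using SGP at the school and district level rather than for individual teachers.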

Therefore: Part of the solution is to split MCAS into two distinct components. Whatever direction the state takes with regard to PARCC, the policies surrounding assessment need to acknowledge that the same test cannot practically be used for both accountability and instruction. The Legislature should direct DESE to implement a greatly abbreviated spring assessment used exclusively to generate SGP for school and district accountability, and a fall, locally administered, state-coordinated assessment designed to produce skill profiles for each student to inform instruction.

Students change over the summer. New kids enroll. Many kids regress, some accelerate. Differentiated instruction and personalized learning require a detailed understanding of each learner's mastery of generally accepted skills. The current MCAS and proposed PARCC assessment designs do not produce skill-level reports, nor is the information timely or complete for the students assigned to each teacher. Teachers need and families deserve educational support for the “zone of proximal learning,” not just the middle of the group. Embedded diagnostic assessment tools like Khan Academy, TenMarks, IXL, ALEKS, DreamBox, etc. provide teachers with the information they need to differentiate instruction without endless additional grading or pulling students away from time-on-task learning.

It is time for the Commonwealth to lead again. The current debate between a growing extremist movement from both the left and the right and a moderate middle that wants to retain common-sense, high, and specific expectations for all students is a destructive waste of energy. We are better than that. Again and again Massachusetts has led the nation in education innovation. We need to fix MCAS, reduce testing time, stop penalizing urban districts for serving poor students, and focus on giving teachers and families the tools they need to ensure every child reads by 3rd grade, every middle school graduate is competent in algebra and proficient in writing, and every high school student graduates with the core STEM, ELA, and life skills they need to be ready for higher education and careers. Let’s end the destructive debate and get to work.

Sunday, June 14, 2015

This is what I am looking for

Now imagine what this might look like in practice. Students come to school and learn through a variety of face-to-face and online activities. As they learn, they are given opportunities to practice and demonstrate their learning and receive feedback on an ongoing basis. When they complete learning activities that require them to use basic factual or procedural knowledge, software evaluates their performance and provides immediate feedback. When they complete learning activities that require deeper levels of understanding, analysis, and critical thinking, the learning platform captures their performance (in video, audio, written, or other formats) and immediately sends it to expert graders who score their work and provide feedback to help the students improve. Then, as students progress through the platform’s learning activities, the results from both the machine-graded and human-graded standardized assessment items are incorporated to create a complete and robust picture of the students’ mastery of learning standards.

From Thomas Arnett June 12, 2015 post on The key to rigorous online assessments at Christensen Institute

See more at: http://www.christenseninstitute.org/the-key-to-rigorous-online-assessments/

Wednesday, January 7, 2015

Ways to Improve Assessments


The Council of Chief State School Officers and the country's largest school districts have spoken out in favor of reducing the number of standardized tests students take. The national teachers unions and other traditionally Democratic groups are on board with the idea too.
Education Secretary Arne Duncan says he is concerned about testing too, but he has written he "strongly believes" in annual tests as an educational tool.
Missing from this debate, however, is a sense of what could replace annual tests. What would the nation do to monitor learning and ensure equity and accountability if states didn't have to test every child every year?

Here are four possible answers. They're not necessarily mutually exclusive. In fact, they could all happen at the same time, as different states and districts make different decisions.
1) Sampling. A simple approach. The same tests, just fewer of 'em. Accountability could be achieved at the district level by administering traditional standardized tests to a statistically representative sampling of students, rather than to every student every year.
That's how the "Nation's Report Card" works. Formally known as the National Assessment of Educational Progress, or NAEP, it's one of the longest-running and most trusted tests in the U.S. education arsenal, even though it's not attached to high stakes. It's given to a different sample of students each year, in grades 4, 8 and 12. The widely respected international test PISA is given to a sample of students too.
2) Stealth assessment. Similar math and reading data, but collected differently.
The major textbook publishers, plus companies like DreamBox, Scholastic and the nonprofit Khan Academy, all sell software for students to practice math and English. These programs register every single answer a student gives.
The companies that develop this software argue that it presents the opportunity to eliminate the time, cost and anxiety of "stop and test" in favor of passively collecting data on students' knowledge over a semester, year or entire school career. Valerie Shute, a professor at Florida State University and former principal research scientist at ETS, coined the term "stealth assessment" to describe this approach.
Stealth assessment doesn't just show which skills a student has mastered at a given moment. The pattern of answers potentially offers insights into how quickly students learn, how diligent they are and other big-picture factors.
"Invisible, integrated assessment, to me, is the future," Kimberly O'Malley, the senior vice president of school research at Pearson Education, told me. "We can monitor students' learning day to day in a digital scenario. Ultimately, if we're successful, the need for, and the activity of, stopping and testing will go away in many cases."
Applying this approach on a national scale using scientific methods has never been done, in part because the products are still new. It would probably require a large outlay in terms of software, professional training and computer equipment — and would result in a corresponding windfall for companies like Pearson.
3) Multiple measures. Incorporate more, and different, kinds of data on student progress and school performance into accountability measures.
Statewide longitudinal data systems now track students in most states from pre-K all the way through high school (and in some states, college). That means accountability measures and interventions don't have to depend on the outcome of just one test. They could take a big-data approach, combining information from a number of different sources — graduation rates, discipline outcomes, demographic information, teacher-created assessments and, eventually, workforce outcomes. This information, in turn, could be used to gauge the performance of students, schools and teachers over time.
As part of a multiple-measures approach, some districts are also collecting different kinds of information about students.
3a) Social and emotional skills surveys. Research shows that at least half of long-term chances of success are determined by nonacademic qualities like grit, perseverance and curiosity. As states expand access to pre-K, they are including social and emotional measures in their definitions of "high quality" preschool. As one component of a multiple-measures system, all schools could be held accountable for cultivating this half of the picture.
The Montgomery County Public Schools in Maryland survey both students and teachers on social and emotional factors and use the results to guide internal decision-making. The district uses the Gallup student poll, a 20-question survey that seeks to measure levels of hope, engagement and well-being.
"Engagement" is basically a measure of how excited students are to be in the building. Last year, 875,000 students took the Gallup poll nationwide, in grades 5-12. According to one study, student hope scores on this poll do a better job of predicting college persistence and GPA than do high school GPA, SATs or ACT scores.
3b) Game-based assessments.
Video-game-like assessments, such as those created by GlassLab and the AAA lab at Stanford, are designed to get at higher-order thinking skills. These games are designed to test things like systems thinking or the ability to take feedback — measures that traditional tests don't get at. Of course, they are still in their infancy.
3c) Performance or portfolio-based assessments.
Schools around the country are incorporating direct demonstrations of student learning into their assessment programs. These include projects, individual and group presentations, reports and papers, and portfolios of work collected over time. The New York Performance Standards Consortium consists of 28 schools, grades 6-12, throughout New York State that rely on these teacher-created assessments to the exclusion of standardized tests. These public schools tend to show higher graduation rates and better college-retention rates, while serving a population similar to that of other urban schools.
4) Inspections.
Scotland is a place where you can see many of the approaches above in action. Unlike the rest of the U.K., it has no specifically government-mandated school tests. Schools do administer a sampling survey of math and literacy, and there is a series of high-school-exit/college-entrance exams that are high stakes for students. But national education policy emphasizes a wide range of approaches to assessment, including presentations, performances and reports. These are designed to measure higher-order skills like creativity, students' well-being and technological literacy as well as traditional academics. Schools and teachers have a lot of control over the methods of evaluation.
At the school level, Scotland maintains accountability through a system of government inspections that has been in place in the U.K. since 1833. Inspectors observe lessons, look at student work and interview both students and staff members.
This piece is adapted in part from The Test: Why Our Schools Are Obsessed With Standardized Testing, But You Don't Have To Be (PublicAffairs, 2015).

Tuesday, October 28, 2014

Parents Stand Up for Stupidity

REJECTING THE TESTS: New York film editor Michael Elliot is adding fuel to the opt-out movement with a short documentary about parents who have chosen to pull their children from standardized testing. The 3 ½ minute video consists solely of close-ups of the parents’ faces as they plead for an end to the stress and pain their kids associate with excessive testing. One by one, the parents announce “We’re out.” The video: http://bit.ly/1t3NrU6

Wednesday, August 20, 2014

Americans overwhelmingly support high-stakes testing

8/20/14 12:01 AM EDT
Americans overwhelmingly support high-stakes testing to determine whether students should advance to the next grade or graduate from high school, according to the annual PDK/Gallup Poll.
The poll, released this morning, found that more than two-thirds of public school parents don’t believe standardized tests help teachers assess their students’ academic performance. Yet both parents and the public at large expressed very strong support for using tests for certain high-stakes decisions.
The 46th annual PDK/Gallup poll also documented a shift in concerns about public schools. A decade ago, nearly a quarter of Americans thought that discipline issues — drug use, gangs, fighting and the like — were among the biggest problems facing schools in their community. Today, just 9 percent name discipline as a major concern. But 32 percent believe lack of financial support is a big problem for schools. That’s up from 21 percent who cited finances as a top issue in 2004.
Another notable finding: Support for President Barack Obama’s education policies is sinking. Just 5 percent of respondents gave him an A for his “performance in support of public schools,” down from 11 percent in 2011. And 27 percent flunked him outright — up from 15 percent three years ago.

Public school parents are also less enthusiastic about their local schools. Asked to rate the school their oldest child attends, 67 percent graded it A or B, which sounds good — but it’s a drop from prior years, when anywhere from 71 percent to 77 percent of parents gave their schools a top grade. In the past, barely any respondents gave their local school an F; this year, 5 percent did.

Thursday, June 26, 2014

Parent Survey on Testing Time

PARENTS SAY TESTING A TIME SUCK: A new survey says parents think their kids spend too much time preparing for and taking exams. The annual Schooling in America Survey, released today by the Friedman Foundation for Educational Choice and Braun Research, shows that:

  • 44% of parents think test prep takes too much time
  • 22% of parents say their children don’t spend enough time
  • 30% say they spend the right amount of time. 


More than six in 10 Americans also support vouchers, the survey says, with the most support coming from black parents at 74 percent and Hispanic parents at 72 percent. The Friedman Foundation, a school choice proponent, also noted that support for vouchers grew. In 2012, 56 percent of parents supported vouchers compared to 63 percent this year. The American Enterprise Institute is hosting a talk about the survey starting at 3 p.m. Eastern. The survey: http://bit.ly/1iwSauo

Wednesday, June 25, 2014

Taking on the 'Climate Denier' Anti-PARCC Mob

With all due respect, 14 states still makes PARCC bigger than all previous K-12 summative assessments. Not DOA. I would ask everyone to be careful about overstatements and misstatements of fact. I felt at times at the public forum at the library like I was at a West Virginia meeting of climate deniers discussing global climate change. There was a looseness with facts that belies the thoughtful discussion we should be having.

From my perspective, while the decision whether to switch to PARCC a year earlier or to wait a year is a relatively small, technical decision that should be based on technology readiness, speed for return of results, and logistics, there are broader issues which I’d urge people to get involved with.

For example, the district is planning on utilizing EdWin, the state’s new Instructional Improvement System, to deliver interim benchmark assessments across the district next year. There are other interesting assessment options (more MAPS, The Achievement Network, IXL). This decision will have a much greater impact on our kids’ instruction than MCAS vs. PARCC. I would love to see the School Committee focus fall public forums on the comprehensive topic of assessment and not get overly distracted by a small, less consequential decision.

I guess what I am asking all members of our community, particularly on-line, is to elevate the discussion to our highest aspirations for our kids and schools and avoid the downward spiral of cynicism.

We have a new School Committee and will have a new Superintendent a year from now.  We have an opportunity to build on some very impressive gains in core academic performance, a great staff, safe and attractive buildings and a great mix of kids.  We are poised to become an ideal community for urban education.  Let’s stay focused on our broad goals and topics.

g.

From: somerville-4-schools@yahoogroups.com [mailto:somerville-4-schools@yahoogroups.com] On Behalf Of Frank Kelly kellyfj@gmail.com [somerville-4-schools]
Sent: Wednesday, June 25, 2014 6:56 AM
To: somerville-4-schools@yahoogroups.com
Subject: [somerville-4-schools] Re: School Committee discussion on PARCC and MCAS

 
Of the original 23 states who joined PARCC - 9 have dropped out

Time to admit that PARCC is DOA when less than one third of US states will adopt it.

Time also to admit that top-down Governmental efforts that bundle Arkansas, Mississippi, and Louisiana with Massachusetts into the same bucket are doomed to failure. 

How about a plan that unleashes the creativity and passion of our teachers rather than encasing them in more and more laws?
Sadly our lawmakers cannot legislate that :-\


-Frank

Thursday, June 19, 2014

LA and PARCC

Louisiana Commissioner of Administration Kristy Nichols today said she would suspend the state’s approval of a contract that the education department intended to use to purchase the PARCC Common Core exams and launch an investigation into whether it should have been put out for competitive bid.
The move adds another twist to — and intensifies the political infighting over — Gov. Bobby Jindal’s move today to pull the state out of PARCC and repeal the Common Core.
State Superintendent John White has said that the governor has no legal authority to pull out of PARCC or seek new bids for standardized testing. He said he planned to proceed with full implementation of Common Core and the PARCC exams.
But tonight Nichols said that White appeared to have no authority to purchase and administer the PARCC tests using an old consulting contract dating to 2003. The contract, with the testing company Data Recognition Corporation, has been amended multiple times, most recently in 2013, to allow the firm to continue to administer each new annual iteration of the state’s tests. White planned to amend that contract again to administer the PARCC Common Core tests, Nichols said.
But she said that introducing the new type of exam, potentially with different subcontractors, was beyond the scope of the original contract. White’s plan, she said, appeared to “circumvent procurement laws.”
“If that’s the intention,” Nichols said, “… yeah, we’re going to have a problem with it.”
Nichols served as Jindal’s deputy chief of staff before taking on her current role.

Nichols said she would send the state education department a letter this evening demanding that they provide justification for using the existing contract for the new exams. Once she hears back, she said, her office will review the contract and the department’s reasoning and determine whether to allow the contract to proceed or require an entirely new process. In the meantime, she has suspended payment to the company named in the contract.

PARCC Contracting

The PARCC consortium originally planned to have every participating state use the same testing company to write items, administer and score the new Common Core exams — and expected states to adopt the pricing models set forth in a model contract developed by New Mexico. But that vision may have to be overhauled due to the brouhaha over the Common Core in Louisiana, a spokesman for the consortium said tonight.
PARCC let the state of New Mexico request bids from testing companies for several years of Common Core assessments. Pearson was the sole bidder and won that contract. Other states in the consortium were then expected to adopt the same contract — without seeking other bids — for their own Common Core assessments, spokesman David Connerty-Marin said.
But in Louisiana, state Superintendent John White was planning to use an existing contract with Data Recognition Corporation to deliver the Common Core tests, according to his spokesman, Barry Landry. White told POLITICO that the state does not have a contract with PARCC, but planned to use PARCC content nonetheless, apparently through the vehicle of the existing contract with Data Recognition. Earlier tonight, however, the state commissioner of administration put a big crimp in that plan, announcing that it appeared to circumvent state procurement laws. The commissioner, Kristy Nichols, suspended the contract with Data Recognition immediately, pending an investigation.
PARCC had been unaware of White’s plans and was taken by surprise by Nichols’ statement, Connerty-Marin said. But he said the consortium was “committed to working with John White… to figure out a solution that works for Louisiana.”
“The original vision of the PARCC consortium was, and is, based on item security and comparability,” Connerty-Marin said. “The best path forward for those two things is a single administrator for the tests and a single scoring approach” across all participating states, he said.
“However, that said, sometimes politics happens,” Connerty-Marin said. “So if there is another way that will make this work for Louisiana, I think our governing board will certainly look at that.”
PARCC’s original plan was already in jeopardy before the contretemps in Louisiana, because a rival testing company had protested New Mexico’s bidding process, complaining it was rigged in favor of Pearson. A judge has ordered the state purchasing agent to investigate that complaint; both sides are currently filing briefs. There is no timetable for the decision. In the meantime, the contract in New Mexico is on hold and no other state can adopt it.
One more wrinkle in the saga: Pearson’s bid in New Mexico was based on the understanding that all the other PARCC states would adopt the same contract. It priced the tests accordingly.  If states instead pick their own vendors to administer and score PARCC questions, that could upend the economics for Pearson. “It’s just too early to say,” Connerty-Marin said. “We just don’t know the details.”

A contract to administer Common Core tests in all PARCC states would be worth billions of dollars over the next several years.

Thursday, June 12, 2014

My comments on the anti-PARCC movement

PARCC exam draws fire from Somerville parents, teachers

By Monica Jimenez
mjimenez@wickedlocal.com
June 12, 2014, 8:09 AM

Too much testing in Somerville Public Schools is hurting teachers and students and should be reconsidered, according to Somerville teachers and parents.

At a forum on high-stakes standardized testing at the Somerville Public Library June 5, a petition to pause the rollout of the PARCC exam sparked an outpouring of fear, heartbreak and anger from more than 60 teachers, parents and education professionals about the ever-increasing emphasis on standardized testing in public schools, which they said is decreasing the quality of education and putting stress on teachers and students. The forum was organized by three opponents of increased testing – State Sen. Pat Jehlen, Somerville teacher union head Jackie Lawrence and former Cambridge Public Schools kindergarten teacher Susan Sluyter, who has spoken out nationally against the effects of increased testing.

PARCC, which stands for Partnership for Assessment of Readiness for College and Career, is a test developed across states and based on federal Common Core standards. Massachusetts officials have worked on developing the PARCC test and 81,000 students, including ones in Somerville, have taken a pilot version of the exam this year instead of the MCAS test.
Firsthand experience

The proliferation of further testing is the last thing schools need, Somerville teachers and parents said at the meeting.

"In Somerville, kindergarten tests are immediate, right off the bat," said East Somerville Community School third-grade Unidos ELA and math teacher Patrice Hobbs. "As soon as school starts, your child is being tested."

Somerville parent Janine Ell said although her daughter does fine on all the tests, there are too many, and there are always new ones to get used to.

"My bigger issue is about what we want to spend time on in the course of the day. I think we spend too much time on testing," Ell said. "And sometimes I don’t want all the information I get. Let’s make some decisions about what’s valuable and stick to it."

Superintendent Tony Pierantozzi’s upcoming retirement provides a prime opportunity to effect change by influencing the School Committee’s selection of a new superintendent, but in the meantime there’s something else they can do, Ell said.

"The Somerville community should ask the School Committee and the superintendent to pause PARCC for the 2014-2015 school year to allow a community-wide discussion about PARCC," Ell said. "Somerville has not had a chance to thoroughly discuss what it means for the kids."

Boston University professor Bayla Ostrach said she has butted heads with Somerville school administrators, from principal to assistant superintendent to superintendent, over her daughter not wanting to take the MCAS. Administrators have been reluctant to give her information about testing dates and have pressured her to have her daughter take the tests, Ostrach said.

Not only that, but her college students are stymied when she tells them she doesn’t give tests and she expects them to show their learning through speaking, Ostrach said.

"Kids come into the classroom and they’re good at taking tests, but they’re not good at walking in the room and throwing ideas around and thinking critically about it," Ostrach said. "They think I tell them what to know and they regurgitate it on tests."

Parent Brian Duplisea said it’s not right that parents don’t know much about the transition from MCAS to PARCC.

"I think we need to put pressure on the school system and say we demand a dialogue because we are paying for this," Duplisea said. He added, "Our kids are too important."

In defense of testing
However, a few parents and School Committee members spoke in defense of testing. Parent Greg Nadeau was frustrated with what he called the "one-sided" conversation at the forum, saying education is going through a transition and testing helps focus resources on less proficient students in order to achieve equity in schooling.

"It’s a false dichotomy to say because we’re doing DIBELS [early literacy test], we can’t have play in our kindergarten classrooms. It’s ridiculous and not true. We can have play and art. And we can also have data, facts and science. They’re not mutually exclusive," Nadeau said.


He added, "What we should be talking about is how to make this better and implement it in a way that’s better for the whole child."


School Committee member Paul Bockelman said the remarks he heard at the meeting were "disheartening" and the School Committee will be grappling with the MCAS versus the PARCC, but defended testing.
"The idea that tests are bad is ridiculous. Every teacher said they are assessing kids every day. Every industry is becoming more and more about assessing how we’re doing. We want that," Bockelman said.
Still, Bockelman said assessments need to be balanced with teachers being able to use their professional judgment in their classrooms.

"Tests aren’t perfect. Nothing does critical thinking like a teacher sitting with a student," Bockelman said.
School Committee member Caroline Normand also asked parents to remember and celebrate what is good about Somerville schools, to speak about what they value so the school system can do more of it.
"We need to have assessment, but useful assessment," Normand said.

Testing through the years
A few Somerville residents went more in-depth into the bigger-picture effects of too much assessment, and the history of testing.

Somerville resident Nancy Carlsson-Paige, who taught teachers at Lesley University for 35 years and is now involved in education policy, said several factors have contributed to the increasing emphasis on testing, including a genuine desire to close the achievement gap, the increased federal role in public education, and the growing belief that collecting systemized data would lead to successful schools and learners.

However, schools with low test scores have been closed and sometimes replaced with charter schools, which means billions in public tax dollars are moving from the public to the private sector, Carlsson-Paige said. Teachers are vilified and punished for low test scores and are working in a climate of fear, she said.
"What it does, when a teacher’s job is on the line and they pay for test scores, is make them teach to the test," Carlsson-Paige said. "They have to abandon the kind of education they love because the stakes are high for teacher survival as well as the school and the child."

Somerville resident Clara Simmons traced the emphasis on standardized testing back to U.S. President Franklin Delano Roosevelt, who agreed to institute testing not because he believed in it, but because Republicans demanded it to hold him accountable for his educational initiatives.
"They are a statistical hoax. If you talk to a well-informed superintendent, they will admit it," Simmons said. "A fourth-grade test isn’t a fourth-grade test. It is a statistical procedure to decide what the average fourth-grader should know."

Use politics
State Sen. Pat Jehlen, D-Somerville, whose grandchildren attend the Healey School in Somerville, said the intention behind standardized testing is good, but the way it’s being used is not.

"It’s not testing that’s wrong, it’s the interventions that follow, the high stakes for kids’ graduation and moving from grade to grade," Jehlen said. "For teachers it is or will be part of their evaluation. For schools it begins to be important because schools can be labeled low-performing."

It’s unfair because studies have shown student achievement is largely determined by family income, not by the teacher or the school, Jehlen said. Test scores reflect the wrong information, and prioritizing them leads to the wrong outcomes, she said.

"They’re saying you have to turn around your school in three years or fire all the faculty. There are really fast ways to turn around schools and turn around scores," Jehlen said. "Get rid of special ed, get rid of ELL. If you want better test scores, you just have to get kids from better backgrounds, who don’t have family members getting shot at night, who get breakfast even on weekends."
"Let us know you’re being sarcastic, OK?" pleaded Ell.

Rather than focusing on what students should know on a test in high school and working backwards to determine early childhood curriculum, schools should teach the way kids learn, and listen to their preschool teachers, said Jehlen, who urged keeping the conversation going.

"Don’t be afraid to use politics, because that’s what’s going to change what happens here," Jehlen said.