
Improving Student Learning Through Data-Driven Instruction
BY ANTHONY SUDWEEKS, CO-EXECUTIVE DIRECTOR AND DIRECTOR OF ACADEMICS, WALLACE STEGNER ACADEMY

During my last year teaching in a large, low-income, district-run turnaround school, I had 26 students in my sixth-grade class. How many of them could read on grade level on the first day of school? Only two. What percentage could perform basic multiplication problems? Only 50%. In fact, a full 25% couldn’t perform a subtraction problem if it required borrowing. This data affected me profoundly, and I knew something had to be done.

So, I did what every good teacher would do: I Googled for answers. It wasn’t long before I stumbled on a large charter network in New Jersey that was using data to drive its instruction. As a result, despite living in some of the most impoverished neighborhoods in Newark, their students were not just performing above the state average; they were performing at a higher level than the most affluent districts in the state! I knew my students were every bit as capable as those students, so I began writing what the network called “interim assessments.” The idea was that if I could write a rigorous interim assessment designed to measure how well students had mastered six to eight weeks’ worth of content, then those assessments would give me a road map to where our students needed to go.

I pulled in the other nine sixth-grade teachers at my school, and we got to work. These assessments immediately changed the lessons being taught on a daily basis. Why? Because everyone now knew what the expectation was. Standards tell us what to teach, but they don’t define the rigor; assessments can do that. Suddenly, we were all teaching much harder content, but that’s only half of what was good about this idea.
