Wednesday, November 14, 2018

Beyond Heat Maps: Data Analysis is More than "They Failed"

Data analysis must show us the CAUSES of student misunderstandings and lead us to instructional decisions about HOW we can respond. Heat maps are only the first layer of analysis. The cut points on the heat map correspond to passing standards. How does knowing they failed tell us what the kids don't understand and what part we can play in making that better? The cut points have nothing to do with WHY the scores landed there or WHAT we should do to fix that. I don't wanna rant about all that in this post. If you want me to explain, come have coffee. I like coffee.
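
To show how thin that first layer is, here's a minimal sketch of everything a heat map actually computes. The cut scores, band names, and student scores below are made up for illustration, not anyone's real rubric; the point is that binning scores by cut points is all a heat map knows.

```python
# Minimal sketch of what a heat map computes: scores binned by cut points.
# Cut scores, bands, and student scores are hypothetical.
import pandas as pd

CUT_POINTS = [0, 40, 60, 80, 101]   # assumed cut scores, not a real standard
BANDS = ["Did Not Meet", "Approaches", "Meets", "Masters"]

scores = pd.DataFrame({
    "student": ["A", "B", "C", "D", "E"],
    "score": [35, 58, 62, 79, 91],
})

# Bin each score into a performance band: this coloring IS the heat map.
scores["band"] = pd.cut(scores["score"], bins=CUT_POINTS,
                        labels=BANDS, right=False)
print(scores)
# Nothing here says WHY a score landed in a band or WHAT to do about it.
```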

We have to go beyond the heat map.

Because what we really need is data analysis that shows us what we are going to DO when there are 30 faces looking at us in the classroom.

Because what we really need from data is something practical that puts feet to our prayers and rubber to the road. 

First, statistical analysis over multiple items and years (trend data) can tell us which level of the gradual release model needs to be addressed to correct the misconceptions. In the F19B data I analyzed for a district, the results indicated that additional work was needed in Quadrant One (Modeling, Thinking Aloud, Direct Instruction/Delivery) and Quadrant Two (Shared and Interactive processing of the content AND the processes/steps used to complete the tasks). Here's the lesson we used to help teachers understand the nuances between each phase as applied to ELAR texts and instruction.
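
If you want to see what "trend data" means in the plainest possible terms, here's a generic sketch. The standards, years, and percentages are invented, and deciding which gradual release quadrant a declining trend points to is still an instructional judgment call, not something the code decides for you.

```python
# Trend sketch: per-standard performance across years. Numbers are invented.
# A declining or flat trend tells you WHERE to look; mapping that to a
# gradual release quadrant is an instructional judgment, not a calculation.
pct_correct = {
    "F19B": {2016: 61, 2017: 58, 2018: 54},
    "F19D": {2016: 70, 2017: 73, 2018: 75},
}

for standard, by_year in pct_correct.items():
    years = sorted(by_year)
    change = by_year[years[-1]] - by_year[years[0]]
    trend = "declining" if change < 0 else "improving"
    print(f"{standard}: {trend} ({change:+d} pts over {len(years)} years)")
```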

Second, statistical analysis can also tell us what kind of instructional strategies we should employ because of where the kids are in the learning process. Fisher, Frey, and Hattie talk about how important it is to choose the right instructional strategy for where students are in the learning process: Surface, Deep, or Transfer. Basically, if you are using the wrong strategy at the wrong time, the data will show it. You can read about that here: Visible Learning for Literacy. Someday, when I have more time, I'll explain how you can use the data to point to which strategy level should be used.

Third, item analysis patterns and trends over multiple items help us name the cognitive gaps in reasoning, content, alignment between curriculum and assessment, mismatches in materials, and even test-taking processes. When you give a NAME to the thinking error, you can design a response. When we looked more deeply at the item analysis (the spread of answer choices) for F19B items, we saw important things about our daily instruction that we could change quickly and that would make a huge difference.
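
If you want to try the "spread of answer choices" on your own results, here's a hedged sketch of the computation. The responses and answer keys are invented; the takeaway is that a distractor pulling as hard as the correct answer is a nameable thinking error, and a flat spread looks like guessing.

```python
# Item analysis sketch: how did answer choices spread on each item?
# Responses and keys below are invented for illustration.
from collections import Counter

responses = {
    "item_1": list("AABBCBBBAB"),   # each letter = one student's choice
    "item_2": list("DDCDDDCDDD"),
    "item_3": list("ABCDABCDAB"),   # flat spread: looks like guessing
}
keys = {"item_1": "A", "item_2": "D", "item_3": "C"}

for item, choices in responses.items():
    spread = Counter(choices)
    n = len(choices)
    key = keys[item]
    print(f"{item} (key = {key}):")
    for choice, count in sorted(spread.items()):
        flag = ""
        if choice != key and count >= spread[key]:
            # A distractor chosen as often as the key points at a specific
            # misconception, not just "they failed".
            flag = "  <-- name this thinking error"
        print(f"  {choice}: {count/n:.0%}{flag}")
```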

We analyzed three items, y'all. Three. But what we learned changed everything about how we were going about our work. We came away with simple, clearly articulated understandings of what we needed to change about what we were reading and how we were reading it. The analysis showed us what content we needed to cover, what processes we needed to teach, and what reasoning and thinking lessons we had left untaught.

And no one complained that no one understands ELAR. No one blamed the SPED kids. No one argued that the questions were mean or tricky. No one made excuses. Because they could see what the problems were AND they didn't feel helpless to find a solution. Data analysis that day was more than "they failed" because we knew why and what to do next. 

