Friday, October 26, 2018

Author's Purpose Isn't about Naming It: Implications for STAAR Assessment and Instruction

My friend Sarah and I set aside a day each month to collaborate on the staff development she will conduct at her campus. As we looked at the TEKS Resource System YAG and IFD, we realized that we still had many questions about how we needed to approach the work. What could we do to help teachers understand the approach they needed to take with Expository (3rd grade), Informational (4th-11th grades), and Literary Nonfiction (12th grade) texts? We started with 5th grade - not sure why. What we found, and the process we discovered along the way, was powerful learning, y'all. (Note that we are also working to integrate the NEW standards with our current work.)

Well. Who cares...

Let me get to the point. STAAR approaches author's purpose questions for Informational texts in FOUR different ways that have important implications for rigorous, aligned instruction and performance assessments. And I don't think we are teaching it that way.


  1. Over the six-year history of STAAR, 5.10A has been assessed 18 times - averaging 2-3 questions on each administration. (Note that in all 18 cases, the standard has been assessed ACROSS multiple texts.) 5.10A has been dual-coded with Figure 19 twice, and 5.13B has been assessed twice. I used the lead4ward IQ tool to find the questions and identify the frequency.
    1. So what does that mean? When you are looking at your YAG for Informational text, it would behoove you to make sure that you are planning instruction correctly for such a highly tested standard. 
    2. And because of the nature of this standard, it also has the potential to help students build a deeper comprehension of the entire text, which should enhance performance on other standards.
  2. Performance assessments do not necessarily include the type of thinking required for success on STAAR. If the work we are asking students to do does NOT include these nuances, then we are not really teaching the standard. 
    1. We must revise our performance assessment to include the reasoning processes reflected in how this standard is interpreted and assessed by TEA.
    2. We must create text SETS that allow students to experience this work across multiple texts. 
    3. In planning for our new standards, we must connect this work beyond reading a text for comprehension and move toward reading texts like writers do - examining our own intentions/purposes and the craft choices that deliver those to the reader.
Here's an example of a performance assessment that does not include the right level of rigor, along with the changes that might need to be made.

There are FOUR CATEGORIES that define how this standard is assessed.

Simple and Straightforward: Author's Purpose + Main Idea. Y'all - this is NOT about PIE (Persuade, Inform, Entertain). TEA explained this in a presentation to CREST. Basically, kids are reading the verb and making their selection; they aren't reading the rest of the stem. In addition, these questions address the purposes of sections of text and why they were included, as well as the purpose of the entire passage.

As a matter of fact, Sarah and I found very little PIE going on in the verbs used in the stems or answer choices. Here's a list of the words we DID find:

Author's Purpose Given + Main Idea + Text Structure: Sometimes, the questions tell the students what the purpose is. Incorrect answer choices give viable/true details, while correct answer choices reference the main idea of the whole text. In some questions, answers also imply text structure - linear/chronological, cause and effect, etc.

Author's Purpose + Main Idea + Text Evidence: These questions give text evidence in the stem and/or answer choices. The answer choices will include a verb about the purpose and add content that references the main idea or viable details from the passage. Sometimes, the questions even name a move that the author makes (specific language used, directly addressing the reader, etc.) and ask why the author would have done such a thing.

Making Inferences about the Author's Purpose for Including Text Evidence: Students are asked to evaluate why an author would have included a piece of text evidence. Answer choices list viable details, but only one of them is directly connected to the main idea/purpose of the entire passage and the particular piece of evidence cited in the question.

I've created a slide show with examples and question stems that you can use to create lesson plans and add this level of rigor to your existing performance assessments. 

WE MUST expose students to the type of thinking they must do when they read and write in real life. And that's never about PIE. When we move beyond that, we'll see better results on STAAR too.



