CREST, 9/23, 2019 Notes and Quotes. The
following notes were taken at a presentation by TEA staff at the Fall
conference for the Coalition of Reading and English Supervisors of Texas. Notes
are summarized by topics addressed. Statements in quotations and courier font are verbatim from the speaker. Notes were reviewed by the CREST Board.
Tyson Kane, TEA: Born and raised in Clear Lake, Houston; Clear Creek ISD. University of Texas; private sector for a while; then taught high school in Watts, Los Angeles, California. Taught mostly math and
science. Then moved to South Side Chicago, opened a high school and ran that
for 7 years. Also taught 9th grade reading during that time. Became
a superintendent in Chicago. Then came to Texas.
Introduced Shelly Ramos: Oversees TEKS and
content review for assessment items. Born and raised in San Antonio. Went to
Texas A&M.
TEA Strategic Plan: (Kane) https://tea.texas.gov/About_TEA/Welcome_and_Overview/TEA_Strategic_Plan; he oriented assessment within that plan, connecting it to the TEA priorities and levers.
Took this role because “Good assessment is important for great teaching and instruction.
And right now, we have an assessment that is aligned to the TEKS that is
cheap.” (multiple choice assessment is less expensive than other types) “And
that’s why it’s designed the way it is. And that’s not the best design. Because
it is multiple choice, and it is very limited in being able to diagnose true
student understandings, being able to do so in authentic ways that are more
closely aligned to how we as teachers instruct, or how valuable [they are] from
a formative standpoint to help us know if we are on the right path toward the
targets we want to hit.”
“So, the good news is that in the latest legislative session, we
received an injection of funds because many folks are saying, ‘this will not
do.’ And I was excited to see that we can make some changes. I’m excited to be
on this journey with you, which is part of the reason I came here today.”
Texas Assessment Program Components – (Kane) Reviewed all the testing components:
STAAR, STAAR ALT II, TELPAS, TELPAS ALT (we are one of the few states that are
addressing that need), Interim Assessments, and NAEP.
Interim Assessments: “Let
me make a quick statement on interim assessments, just because I have you here.
There’s a good way to use interim assessments. And a bad way to use interim
assessments. An interim assessment is an assessment that is very good for progress
monitoring – telling you across all standards: ‘Are my students progressing across those standards during the course of the year? Are they on target to achieve an end-state performance?’ (must be indicative, predictive of STAAR performance) They are really good for that.
“They are NOT as good at digging into causes of
instructional misunderstanding. Why? Well, let’s think about it, if we are
covering all the standards on a single test, and that test is pretty short,
then you’re not going to get more than about one question per standard. Maybe
two. We all know that if you look at it and say, ‘Well, gosh, they didn’t
master this standard. We have no percentage mastery on that standard. I have to
reteach that standard.’ Well, do you, though? One question is not necessarily
the most reliable way if you are looking at an individual standard.
“As it turns out, [interim] assessments are a very reliable way to
look at all the standards together. Because that’s what you are
assessing in terms of overall progress. I always caution people when they
come and say, ‘We dug in and we saw that these are the places that we are not
doing so well on our standards. We have to reteach these three.’ I say, well,
maybe. Maybe. You also need to inform your instruction with your classroom
assessments. Do they line up with that [data]? Your formative assessment tells
you the places that you should be putting time and energy, or not. But it’s
great for monitoring at a high level. When you are thinking of your whole class
for intervention groups, grades, or periods, [interim assessments] are the best
aligned that you will find. And they are free!”
Student Assessment Division: Curriculum and Standards, Administration, and Reporting. These divisions are working differently than they have in
the past. Mr. Kane is the Associate
Commissioner and oversees a portion of the Assessment Division and coordinates
the divisions. Shelly Ramos works in the Curriculum and Standards division.
She works with test development for STAAR for every content area. (Monica
Martinez works in this department.) Student Assessment is the next department.
This department covers test development for the other tests – TELPAS, TELPAS
ALT, STAAR ALT, and the administration and scoring of these assessments. (Noted
that the agency works with vendors and teachers to write the questions and
review questions. No one at TEA is sitting in the closet writing test items.)
The Performance Reporting group is led by Jamie Crow. (They are the ones that
developed the A-F online reporting tools.)
HB 3906: (Kane) This is the largest assessment-related bill that passed.
· technical advisory committee continues (Kane): renowned experts consult with the agency for test reliability and validity
· new formal committee for educator advisory begins (Kane): will be put in place this fall
· interim assessments continue (Kane): will be funded; must be predictive of STAAR; not to be used for accountability; electronic administration only
· multiple parts over multiple days for assessments (Kane) (NOT THIS YEAR; could be optional next year; time limits for each part; tests could happen throughout the year as well; much is still to be worked out)
· integrated formative assessment pilot established (Kane) (could replace STAAR someday; could go many ways; not ready to pilot with optional participation until next year; this is their design year; will be reaching out for feedback about design)
o “…start to create formative assessments that can be used for summative purposes. There’s a lot of ways this could go. And this is in its early stages right now. We wouldn’t even start to pilot something like this with optional participation until next year. This is a design year.”
o NOTE: In this section, he is establishing a HYPOTHETICAL example. He is not saying this is what the test will look like. “What ways would you go about it, if you could say, ‘here’s a test that could have multiple parts, that you could take throughout the course of the year, and would count the same way as an end of course…what would you want it to look like? What design elements should be there? Should it be a competency-based model? Should it be something that has a set curriculum that people could opt into and say, ‘Gosh, I would love to know if To Kill a Mockingbird is going to be on [the test]. That would be helpful to me. And if it were related to what my kids were reading already…well, finally, I can bring in background knowledge and some vocabulary for this.’ As a reading teacher, I fully well know that without that, my reading is going to be limited. Just by having cold passages every time, the reading skill [is more complex]. But those of us out there that are reading teachers know how you don’t teach reading…hint…is to drill people on main idea. Or skills-based things. Finding inference. Find the inference. Find the inference. I’m talking to the group that knows. When I present to the math group, people look at me and ask, ‘What are you talking about? You just calculate the frequency then they can answer main idea correctly.’”
o Established again that he was excited about what this could look like and requested our help in designing what the assessment could become.
· electronic administration (Kane): required by 2023
o but – there is a legislative session before that date, so things might change; not a done deal, just directional; they are working on a report to the legislature about feasibility and district preparedness; might recommend legislative changes or resource changes to allow this to happen; Kane is a fan of online testing if the glitches in administration are resolved (zero central testing issues were experienced last year); online testing allows faster results and a flexibility of test item types that could be better for kids and for truly measuring their knowledge
· standalone 4/7 writing eliminated in 2021-22 (Ramos)
o BUT the feds want writing in all grades; educators will design and give feedback
o implementation for 3-8 is NOT this year but in 2021-22; WILL have writing field test items for revising and editing THIS year for grades 3, 4, 5, 6, 7, and 8
o Will have revising and editing passages built into the test, but they won’t be intermixed with reading, to avoid confusion
o Revising and editing will be its own section
o By the time the test is fully redesigned in 2022, the agency will have items ready in the test bank
o Have already had committees looking at the items
· 75% cap on multiple-choice questions (not this year; by 2023) (Ramos): This means items, not points. Right now, we only have one non-multiple-choice item in grades 4 and 7 and English I, II, and III EOCs – the written essay.
o right now, the agency is getting opinions
o most of these item types are very expensive; multiple choice is cheap
o trying to do things that are great for kids and match the standards; tech-enhanced, short answer, a variety of formats; we don’t know what the item types will be; hearing strong support for short answer right now; share your opinion
o They plan to share lots of practice items so kids will be familiar with the format and how they might approach the problems; looking at an accelerated timeline to redesign for 2022. Note: this is for all grade levels and content areas.
o they are looking at what items are best for kids and the connection to the curriculum, as opposed to what could be done
o Getting a lot of support from the field for short answer; share your ideas and opinions with them
TEC section 39.023(a); (Ramos addressed this) We must assess the depth
and breadth of our curriculum. Federal law requires us to assess the depth and
breadth. Currently, listening, speaking, and research and inquiry are not
assessed on STAAR. The agency is looking at the best way to assess that, and whether STAAR or some other kind of assessment is the best place to assess those standards. Note: there are other subjects that have these issues. They are starting
with writing. Inference: Expect other content areas to change and address more
of the curriculum.
Participant pointed out:
Listening and Speaking are embedded in every strand, not as a separate strand; must
show integration. Ramos responded: The agency is still looking at how they will
address the integration of the strands and domains of literacy.
Mr. Kane addressed
questions sent to him prior to the meeting and those arising from the audience.
RLA redesign: No blueprint change or reporting category
changes for 2 years; 2-year overlap; 3-8 overlap this year and next year;
only one overlap year for 9-12; no new standards will be assessed; teachers
need time to adjust to new standards
Question from Audience: The side-by-side document was removed; (Kane) stated the documents will be ready by the end of this month, September.
Question from Audience:
Will the side-by-side list include reporting categories? (Ramos) Eligible TEKS are listed; there will be the same reporting categories; readiness and supporting will be retained.
(Kane) There will be no moratorium on accountability; the timing of the overlap is the reason. They have sufficient assessment items to keep the test going, unlike what they had when math had new standards.
Overlap EOC retesters – Which tests will they take? (Ramos): looking into it but have no firm answer yet; planning to just use the overlap standards where students have received instruction in that content.
Field test items for
writing – (Kane) part of the
reading test as always; not all kids will see writing items; some kids will see
writing items, some will see reading field test items; the length of tests, numbers of questions, and passages do not change
Coding of questions: Will there be dual coding? (Ramos) Not during overlap/transition years;
it may go back to dual coding, but we don’t know until the test is fully
redesigned;
8th grader
taking English I course:
What test do they take? (Kane) They take the test for the course they
are enrolled in. They do not need to double test. This may change; if it does, it will be due to ESSA and federal requirements.
Scoring essays: Since writing is going to all grade levels,
can (actual) teachers score the papers? Mr. Kane answered – can’t promise that ALL scorers will be teachers; there is a validity process for graders – they knock them out or intervene with multiple strategies to keep things valid: a tightly monitored scoring process, checks that ratings don’t drift over time, a 3rd rater, rescore options, etc. There are multiple failsafes to keep the process valid.
When will we see the
updated list of universal screeners? The revision to the screeners to be implemented is covered in a small paragraph in HB 3; TEA communication is forthcoming.
2019 released STAAR
tests: will be available
during October; PDF documents will be available, but there will not be an option for an online version/practice test or the ability to order printed copies; released on a rolling basis as soon as they come out. (The reason they didn’t come out when they usually do is because of funding years and availability of money; there wasn’t enough to fund everything they normally release.)
No change in test length
(Kane)
25% options are to be
determined: (Ramos) they haven’t
been designed yet; we don’t know what item types they will select; they want
feedback about what we think would be best.
Will field test items be
weighted – (Ramos) no, they are
not scored, so there is no need for weight during the transition years; there will be weighting assigned at standards setting when the test is fully operational
Field Test Items: (Ramos) 3rd-8th grades
will have a single additional field test passage for editing, revision, or
reading; no different in length than what we are doing now
o revision passages are longer than editing passages
o not every student will see one of those items
o field test items are one passage with one to six associated items
2022 – when writing
becomes operational in 3-8, writing will be a component of the score to pass, like it
is in the EOC; decision will be made after standard setting; lots of
opportunities for input over time. (Ramos)
Will we have a
composition this year and next year? YES. No changes expected. They see an alignment between informational and expository. (Ramos) And all the kids in 3-8 will have only one of three field test passage types: revising, editing, or reading.
Will we have prompts for
3-8 going forward: (Ramos) Not for the
transition years. Just field testing for revising and editing. There is not a
redesign plan finalized for what will happen in 2022. This still must go
through educator committees, agency protocols, and commissioner approval. The
new blueprints have not been designed. We don’t know. It’s a possibility. But
we don’t know. They keep hearing talk about short answer.
(Kane) Explained the
process. The Legislature passes bills. Then TEA has a rule-making phase to write how they will implement the laws. They are in the rule-making phase right now. It takes time – especially when they are checking with all the stakeholders to make the right decisions.
Are readiness and
supporting changing? No. Not for the
overlap/transition years. These will be on the side-by-side.
Will the Raw score
change for 2020 STAAR? (Kane)
Maybe. Probably. Depends on the complexity of the test. Gave a lovely example
of weighing grapes. The grapes should weigh a pound of difficulty. The number
and size of grapes might differ. But you still want to pay for a pound. Same
for tests. You want to get a pound of difficulty for each test.
Argumentative/Persuasive and Informational/Expository: What language should we be using? (Ramos) “One
of the things we did as we were working on identifying the overlap document was
to conduct meetings with educators from across the state to get their input on
the overlap standards that we were recommending. We asked a very similar
question. Overwhelmingly, every grade level said that they don’t see an
alignment between persuasive and argumentative for the terms of assessment. So
you won’t be seeing either one for the transition years. We had the
conversation with four different groups. So, in that sense, you won’t see
language on the assessment for persuasive or argumentative. It won’t show up on
the exam. Students won’t be assessed on persuasive. Students won’t be assessed
on argument. For two years. For English II, that’s still a question that we’re
in the stage of trying to finalize an answer on. Question is English II
persuasive prompt…we are working on final answers for that and we hope to have
that for you, probably in the month of October.”
Readability Study: (Kane) The agency has been tasked to
study this issue by HB 3. The University of Texas is studying the 2018 and 2019
tests to consider the readability. In the test development process, they are
using multiple quantitative and qualitative measures to evaluate readability,
including teacher opinions.
Time Constraints and
Complexity of the Test: (Ramos)
No student will see all three types of field test items, and their tests will be the same length as everyone else’s test.
Passages Connected to
Content Area Topics via Grade Level Science and SS Topics: (Ramos) They started early passage reviews for
these the last two summers. Field tested in 2021. Operational 2022. Caveat:
teachers on the committees tend not to like these passages; they thought they were not engaging or interesting to students.
Expository/Informational:
(Ramos) “As you know, we are currently assessing the composition in the
expository genre. When I mentioned that we met with a lot of teachers since
July, they indicated a strong alignment between expository and informational. So,
at this time, unless there is some significant concern that we hear, we are
going to proceed with aligning expository to informational on both the
composition and the reading passages/genres.”
In terms of test
redesign, does the standard setting change with meets, masters, and
approaches? (Kane) Yes. In 2022.
Questions about the percentage
of passing from the old STAAR performance standards and the percentage on the
new performance standards: (Kane) Yes, some transferability should happen. He was not aware that other content areas have such lower passing standards than ELAR. We have been told that the constructed-response items in ELAR made our passing standard what it is. TEA is also working on how the 25% non-multiple-choice items will be weighted in the standard-setting process. Kane did not know that other content areas were held to such lower standards; he is going to investigate that. (He was extremely good-natured and handled this question well.)
Comparisons between
Spanish and English STAAR: (Ramos)
They are studying readability and comparability to Spanish in the same way as
English, but on a different timeline.
Composition Genres
across grade levels: Will there be a shift in genre (from informational to
argumentative) for compositions in the re-design? (Ramos) There will be no changes for two years.
“For the re-design, I will say anything is on
the table.” Give feedback to them.
You can participate in focus groups and sign up to participate in committees.
Integration of Reading
and Writing: (Ramos) “Until the test is re-designed, we don’t know. I would imagine that one of the possibilities is a reading section and a writing section much like
we have on the EOC. But anything is possible at this moment. That’s going to
depend on feedback and advice we get from educator committees.”
Asked for help in
re-designing the English III test. No one is applying. Please help.
Kane and Ramos thanked us for all of our hard work.