Saturday, February 13, 2016

What if multiple choice questions allowed teachers to SEE their students' thinking instead of just the purported result of that thinking?

An interesting thought.

A couple of disclaimers:

  1. It is late at night/early in the morning, and I am a bit fogged by cold symptoms... the chance of this post containing non sequiturs and general rambling is high
  2. I do not have an answer to this question, so the following post is more of a thought experiment. If YOU have the answer, share it by commenting on this post.
Here is what made me think about this question in the first place.  I like the Mastery Connect tool, but I know that many are still warming up to it.  Part of my role this year is to check in with teachers about their use of assessment data to drive their instructional choices, and in doing so I get to hear some interesting feedback.  

During one conversation, a math teacher bemoaned (rightfully so) the seeming dependence on multiple choice questions as the method of assessment.  Yes, there are other types of questions, as you all know by now, but when it comes down to a quick turnaround, a solid multiple choice question is hard to beat.  What I believe this teacher was responding to was the way in which a multiple choice question's response is so easily divorced from the thinking that ostensibly led to it.  Was it a guess? Was it a happy accident?  If wrong, why?

At present (so far as I know) Mastery Connect does not allow you to capture students' work alongside their multiple choice responses.  You may be thinking, "so what? You can always collect their 'scrap' paper or test booklet/sheet, and review that for a more robust picture of their current state of mastery."  You are certainly right, but what if you did not have to, or more to the point... did not have time to?

Ok, I am going to jump to an idea I have been playing with in my head for a little bit.  It is important that you do not shuffle the story you just read off to your long- or short-term memory, but that you keep it in your active working memory so that you can see the connection (that I hope I am able to make).

I like small groups, and I think most do.  As a teacher you can really be efficient with your instruction, targeting precisely what is needed and by whom.  It seems to be a win-win.  Lately, I have been playing with mid-class checks for understanding: little mini-quizzes that might seem to have a more likely home at the end of class in the form of an exit ticket, a mid-ticket, if you will.

They are uniformly short, so that teachers can quickly score and sort them into groups based on the results.  Doing this requires very precise questioning, so that you can be sure that when a student misses questions 1-3 they need the type of intervention being offered by small group A, whereas students missing questions 4 or 5 need what the other small group is offering.
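
That question-to-group mapping can be sketched in a few lines of code. This is purely a hypothetical illustration (the group names, question ranges, and students are made up, and this is not a Mastery Connect feature); it just shows how deterministic the sorting becomes once the questioning is precise enough:

```python
def assign_group(missed_questions):
    """Return a small-group label based on which mid-ticket questions a student missed.

    Assumes the blog's hypothetical rule: misses on questions 1-3 call for
    intervention A, misses on 4 or 5 call for intervention B, and no misses
    means independent work.
    """
    if not missed_questions:
        return "independent"                      # no misses: work on their own
    if any(q in (1, 2, 3) for q in missed_questions):
        return "group A"                          # misses on 1-3 get intervention A
    return "group B"                              # misses on 4 or 5 get intervention B

# Hypothetical students and the questions each one missed
students = {"Ana": [2], "Ben": [5], "Cam": []}
groups = {name: assign_group(missed) for name, missed in students.items()}
print(groups)  # {'Ana': 'group A', 'Ben': 'group B', 'Cam': 'independent'}
```

The point of the sketch is the design constraint it makes visible: the rule only works if each question cleanly diagnoses one kind of misconception, which is exactly why the questioning has to be so precise.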

This works well when there are multiple teachers in the room, and allows misconceptions to be cleared up right away, so that students do not lose too much ground struggling before the difficulty is noticed.

What if I did not have a co-teacher?  Sad, yes, but not the end of the mid-ticket, for I can use Mastery Connect to deliver the assessment digitally.  That, in turn, allows students to see their level of mastery immediately upon submission.  If I set criteria for a few small-group stations in the room (e.g., re-teaching, partnered, and independent), students can self-sort on the basis of their own data.

Now for the acrobatics.  Put the two stories together; that of the teacher wanting to see student thinking, and of the teacher trying to create data-based groupings on the fly.  

What if multiple choice questions allowed teachers to SEE their students' thinking instead of just the purported result of that thinking?

Suppose you created a short assessment wherein each multiple choice question surfaced a student's knowledge of the process. The question is not about the final answer, but about a step that would need to be taken in pursuit of that answer.  In short, their thinking. Here I am envisioning the exploded views brought to us by the world of technical drawing.

Think of a multi-step problem.  Each step can be demanded in the form of a multiple choice question, and can be done in a few ways.
For example:
  1. Take one problem and explode its steps into the appropriate number of questions and:
    • ask students to choose the response that best shows what the proper step/work looks like
    • ask students to identify the statement that best describes the step necessary to solve the problem 

Directions: Your answers to the following questions will demonstrate your knowledge. Each one represents the completion of a step in the process needed to solve the equation below:

4x - 12 = 12

Question #1:
After completing the FIRST step in solving the equation, you will be left with which of the following?
a.) 4x + 12 = 0
b.) 4x = 24
c.) x = 6
d.) -12 = 12 - 4x

Question #2:
After completing the SECOND step in solving the equation, you will be left with which of the following?
a.) 4x = 24
b.) 0
c.) x = 6
d.) 4 = 24

Question #3:
After completing the THIRD step in solving the equation, you will be left with which of the following?
a.) 4x = 24
b.) x = 6
c.) 24/x = 4
d.) x = 3


Directions: Your answers to the following questions will demonstrate your knowledge. Each question will ask what is needed next to solve the equation below:

4x - 12 = 12

Question #1:
What is the FIRST step that you should take?
a.) Isolate the term 4x by adding 12 to each side of the equation.
b.) Divide each side by x to remove the variable from the left side of the equation
c.) Subtract 12 from each side so that you have zero on the right side
d.) Multiply both sides of the equation by 4 to get rid of the variable

Question #2: 
What is the SECOND step that you should take?
a.) Multiply both sides by 4
b.) Divide each side by 4
c.) Divide by zero
d.) There is no next step

Question #3:
Finally, what is the correct answer?
a.) x = -12
b.) x = -6
c.) x = 6
d.) x = 24
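
For the skeptical (or the tired), the keyed steps in both versions of the example can be sanity-checked in a couple of lines. This is just a quick verification of the arithmetic the questions are built around, with the solution x = 6 plugged into each intermediate equation:

```python
# Verify the solution path for 4x - 12 = 12:
#   add 12 to both sides  -> 4x = 24
#   divide both sides by 4 -> x = 6
x = 6  # the keyed solution

steps = [
    ("original equation: 4x - 12 = 12", 4 * x - 12, 12),
    ("after adding 12 to both sides: 4x = 24", 4 * x, 24),
    ("after dividing both sides by 4: x = 6", x, 6),
]

for label, left, right in steps:
    assert left == right, label
print("all steps check out: x =", x)
```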

I am certain that there are many more ways to ask students to demonstrate their knowledge of processes and, correspondingly, their thinking.  A math teacher could differentiate the same assessment by making it visual, actually providing students with the "notebook" page that SHOWS a problem in various states of completion, and thereby eliminating the classic barrier to math knowledge demonstration that literacy often presents.

This approach of taking an exploded view of a mental process is immediately applicable in science curricula, wherein students must follow procedures or conduct observations and experiments in a specific sequence, but how does it translate to the more nebulous world of humanities?

Again, I am not sure that I have an answer, but I think that in classrooms all over the country ELA teachers are instructing students to analyze text in specific ways, and the same is true for writing. After all, much of teaching ELA is about slowing down the mental processes our adult brains go through as we analyze text and modeling that for students in the form of discrete steps.  Therefore, analyzing the development of Jean Louise Finch is as much a product of systematic thinking as the graphing of linear equations.

There is simply no way that this is the first time something of this kind has been written, and it is even more unlikely that I am the first to have thought about it, so please help push my thinking further by adding your comments to this post.

One last disclaimer:

3. Somehow I have been listening to Conway Twitty the entire time I was writing this... especially odd, since I own only three songs from his mighty catalogue.  Doubly odd that I only recognized this here at the end. I don't know what happened.

What's that Darlin'
How am I doin'
Guess I'm doin' alright
Except I can't sleep
And I've blogged all night 'til dawn.

Friday, February 5, 2016

8th Grade Winter 2016 ELA Simulation Item Analysis

The 8th grade team, as is true for many of you as well, sat down to undertake some item analysis of our Winter 2016 ELA Simulation Exam [Day One]. What follows is a reflection on the process that we went through.

Before the meeting, the team ensured that all student answer documents had been scanned into Mastery Connect so that we would be working with as complete a data set for the 8th grade team as possible.  Our simulated exam targeted many Common Core Standards under the umbrella of Reading Literature and Informational Texts.  In order to create a manageable data set, the team narrowed its focus to the first 23 questions, which were based on 4 passages (3 nonfiction, 1 fiction).

Our express purpose in conducting the item analysis was to surface any trends or patterns in our students' reading skills that would allow us to strategically address those learning gaps or build on the places of relative strength.

Our meeting began by orienting ourselves to the data in a low-inference way, wherein team members (with the Mastery Connect-generated report in front of them) shared initial observations and noticings.
A few of those initial observations were:
  • There seems to be a big difference between literature and informational texts (see questions 1, 3, and 20)
  • Questions 6 and 17
  • Question 5 has an interesting pattern

We transitioned from our low-inference observations to a more critical and analytical mode, aided by the "Multiple Choice Item Deconstruction Chart". We followed the protocol and soon discovered that students struggled, universally, on questions tied to RL/RI 8.4. These questions are about defining unfamiliar words using context, something that we surfaced as an issue in September after our Baseline exam. Confused, because we had triaged that particular issue early (and often), we dug deeper.

Looking at the particular items in question, we noticed another interesting pattern. Questions that asked students to define unfamiliar words were answered correctly at a fairly high rate. However, also bundled into the 8.4 standards is a focus on figurative language. Soon we realized that the questions students were struggling with (in all three classrooms) were those that asked them to engage with figurative language, and in some cases to use figurative language to identify unfamiliar words (a double whammy).

Given that this pattern extended across the whole grade, we, as a team, developed a response that took advantage of a common grade structure that holds one period a week in reserve for independent reading. During this class we decided not only to do a bit of item analysis with our students, but also to address how to analyze figurative language. Furthermore, we identified several key lessons in the upcoming unit wherein we can check in on the progress of students' skill in analyzing figurative language. We plan to include several short responses to Laura Hillenbrand's use of figurative language, organize at least one accountable talk about how figurative language is used to develop the characters in the narrative, and ask students to generate their own figurative descriptions of what they are reading. In this way we will be addressing this instructional need in a myriad of ways and modalities.

Our final response to this discovery was to manipulate the standard itself. Instead of having RL.8.4 lump together unfamiliar words, figurative and connotative language, etc., we are going to create sub-standards to allow for a more granular collection of data. We have done this in the past with success, and look forward to the ways in which more discrete data will enable more targeted instruction in the weeks to come.