School's back. When last year's AIMS scores are released soon, we'll discover which schools are "failing." Meanwhile, high school students - whose AIMS scores in grades 3 through 8 had no bearing on their report cards and didn't even arrive home until weeks after the school year ended - must now pass AIMS or risk not graduating.
We need change. Mesa school board member and state Rep. Rich Crandall has ruffled the feathers of state Superintendent of Public Instruction Tom Horne. Crandall's legislation, tacked onto this year's budget at the last moment, limits the next contract for a corporate AIMS test provider to one year and creates a committee to evaluate AIMS as a high-stakes test for high school graduation. It's a step in the right direction, but the net isn't cast quite wide enough.
We need to re-evaluate the entire structure of Arizona's Instrument for Measuring Standards. On its face, who wouldn't want to measure standards to assess student, teacher, school and district performance?
But can 68 multiple-choice questions measure 82 standards?
Take sixth-grade math, for example. Five strands are broken into 17 broad concept areas containing 82 specific standards. For instance, determining the area of triangles is a specific standard under geometry. AIMS uses 11 questions to measure the 13 concepts in this broad concept area, with results reported as a percentage correct.
Simple math tells you that at least two specific concepts aren't measured at all. Furthermore, no teacher assessing whether students understood how to find the area of a triangle would use only one question. She'd probably try different contexts, use different kinds of triangles and give students different information to ascertain how well they understood the concept.
Furthermore, this standard lacks clarity. The area of a triangle is one-half base times height, but the height coincides with a side only in a right triangle. Do we want students to be able to calculate the height, or will it be given?
The area of triangles is conceptually linked to the area of parallelograms and rectangles (also in the sixth-grade standards). Students ought to be able to explore this kinesthetically. If you put any two identical triangles together, you get a parallelogram (whose area is base times height). And if you rearrange pieces of that parallelogram, you can turn it into a rectangle whose area is length times width - a formula easily demonstrated on large lined graph paper.
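For readers who want the hands-on argument spelled out, here is a sketch of the derivation, using b for the base and h for the height:

```latex
% Two identical triangles, each with base b and height h, fit together
% to form a parallelogram with the same base and height:
A_{\text{parallelogram}} = b \times h
% Slicing a right-triangular piece off one end of the parallelogram and
% sliding it to the other end produces a rectangle of length b and width h:
A_{\text{rectangle}} = \ell \times w = b \times h
% Since each triangle is exactly half of the parallelogram:
A_{\text{triangle}} = \tfrac{1}{2}\, b \times h
```

Each line of this chain is something a student can physically verify with cut paper, which is precisely the kind of understanding a single multiple-choice item cannot reveal.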
Now show that with a multiple-choice question!
Multiple-choice questions don't let us see enough of how students think through problems. The answer choices give some students clues to the right answer even when they don't fully understand the concept - while others are tricked by similar-looking answers, or understand the concept but make one small error and get no credit.
Except for writing, where we use a single essay to evaluate a student, every question on AIMS is multiple choice - and now the scary part: we're going to add science to the mix. Any scientist should cringe at the thought of using multiple choice to evaluate science. That format is better suited to Trivial Pursuit.
In short, the standards for each grade need to be honed into fewer essential elements that are truly assessed by AIMS and by classroom-based accountability measures. Assessment needs to be balanced in a way that encourages deep, thoughtful reasoning, not mere memorization of formulas. We need to add open-ended items to AIMS, get AIMS results back to students, teachers and administrators before the school year ends, and augment AIMS with other means of accountability.
Many of these recommendations appear in the report of the Commission on Instructionally Supportive Assessment - and that would be a good place for Crandall's committee to start.