As a high school science teacher, I often posed exam questions that required application of the many facts that students learned. Usually I finished grading the exams feeling quite frustrated. The students didn't seem able to use what they had learned in any novel way or new situation. Why couldn't they do tasks that required them to see relationships, compare, or make decisions?
The answer came to me when I began working with the authentic task assessment model that the Mid-Continent Regional Educational Laboratory (McREL) developed for the Aurora Public Schools. What students had learned was missing a crucial piece. I had not taught them the thinking processes that they needed in order to use the knowledge that they had acquired.
The McREL assessment model identifies 14 complex thinking processes: comparison, classification, structural analysis, supported induction, supported deduction, error analysis, constructing support, extending, decision making, investigation, systems analysis, problem solving, experimental inquiry, and invention. These complex thinking tasks can be integrated into a range of authentic tasks and assessments of performance on those tasks.
The timing of my discovery was opportune. The Aurora school district, like many others across the country, had been looking for ways to restructure the curriculum so that it would present more challenging and engaging tasks to the students. Fortunately, the district was able to provide funds to underwrite time for teachers to plan and work collaboratively. Pairs of science teachers at Gateway High School were released from the classroom for two days to design authentic tasks and assessments in their content areas.
My partner, Bob Legge, and I worked together to design an assessment task that would integrate one of the McREL thinking processes into Bob's advanced biology class. Since the class was ready to begin studying the uses and regulation of biotechnology, we set out to design a task that would allow students to demonstrate their biotechnological knowledge and simultaneously incorporate a complex thinking process.
Decision making seemed to be the most appropriate thinking process to concentrate on, because it is a way to resolve complex issues that have no clear-cut answer, the very kind of task encountered in determining the uses, possibilities, and limitations of biotechnological applications.
Bob and I recognized that students should be given opportunities to practice with the form of an assessment before it is used. In other words, they need time to practice the decision-making process without also having to attend to content. Consequently, we had students practice making and using a decision matrix that would determine what to do about a hypothetical American hostage situation in Iran (Iozzi and Bastardo 1990). The students were to advise the President on a course of action that would best resolve a situation in which passengers on an airliner were seized as hostages.
A decision-making task requires students to evaluate a list of alternatives, scoring each one against carefully weighted criteria. In our practice run, students based their decision on alternatives and criteria that we provided. The alternatives to be considered included military action, economic sanctions, a blockade, doing nothing, giving in, and diplomatic intervention.
The selection criteria included hostages' safety, public opinion, world opinion, economic repercussions, potential for war, and the potential for further kidnappings (Iozzi and Bastardo 1990). With teacher guidance, the students determined the weighting factors by deciding which of the criteria were most important to a good decision (the weighting factors are proportions, expressed as decimals, that add up to 1). Students also gave each alternative a score based on the extent to which it satisfied each criterion. Each score was then multiplied by the corresponding weighting factor, and the products were summed to give a total score for each alternative. The alternative with the highest total is the decision. In this practice run the scores were based on students' prior knowledge; in later assessments, they were based on the students' own research and reading.
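For readers who want the arithmetic spelled out, the matrix boils down to a weighted sum for each alternative. The sketch below shows the computation in Python; the alternatives and criteria are the ones listed above, but the 1-to-5 scale, the weights, and every score are illustrative placeholders, not the numbers the class actually used.

```python
# Weighted decision matrix: each alternative's total is the sum, over all
# criteria, of (criterion weight) x (alternative's score on that criterion).
# Alternatives and criteria follow the hostage exercise; every number below
# is a placeholder used only for illustration.

criteria = ["hostages' safety", "public opinion", "world opinion",
            "economic repercussions", "potential for war", "further kidnappings"]
weights = [0.30, 0.10, 0.10, 0.15, 0.20, 0.15]   # proportions that sum to 1

alternatives = ["military action", "economic sanctions", "blockade",
                "do nothing", "give in", "diplomatic intervention"]

# scores[i][j]: how well alternative i satisfies criterion j (1 = poor, 5 = strong)
scores = [
    [1, 4, 2, 3, 1, 4],   # military action
    [3, 3, 3, 2, 4, 3],   # economic sanctions
    [2, 3, 2, 2, 2, 3],   # blockade
    [4, 2, 1, 4, 5, 2],   # do nothing
    [4, 1, 2, 3, 5, 1],   # give in
    [3, 3, 5, 3, 3, 3],   # diplomatic intervention
]

def totals(weights, scores):
    """Weighted total for each alternative (one row of the score matrix)."""
    return [sum(w * s for w, s in zip(weights, row)) for row in scores]

if __name__ == "__main__":
    ranked = sorted(zip(alternatives, totals(weights, scores)),
                    key=lambda pair: pair[1], reverse=True)
    for name, total in ranked:
        print(f"{name:25s} {total:.2f}")
    print("Decision:", ranked[0][0])
```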
The students then began developing the matrix that would determine the best advice to give the President (see fig. 1). The first decision reached was to do nothing. The students were very surprised.
After much discussion, the students decided that the original weight that they had assigned to the world opinion criterion was not heavy enough. A change in the weight of that criterion resulted in a decision to use military action. The students' alteration illustrates an interesting feature of making decisions with a matrix: by changing the weights assigned to the criteria, students can see the immediate effect on the decision.
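Continuing the sketch above (and reusing its placeholder numbers and the totals() helper), the same sensitivity can be demonstrated by increasing one weight, renormalizing so the weights still sum to 1, and recomputing. With these particular placeholder values the top-ranked alternative shifts even though no score changes; the class's actual numbers and outcome were, of course, their own.

```python
# Reuses criteria, weights, scores, alternatives, and totals() from the
# sketch above. Triple the weight on "world opinion", renormalize so the
# weights still sum to 1, and recompute. Only the weights change, yet the
# top-ranked alternative can flip.
heavier = list(weights)
heavier[criteria.index("world opinion")] *= 3
total_weight = sum(heavier)
heavier = [w / total_weight for w in heavier]

before = max(zip(alternatives, totals(weights, scores)), key=lambda p: p[1])
after = max(zip(alternatives, totals(heavier, scores)), key=lambda p: p[1])
print("Decision with original weights: ", before[0])
print("Decision with reweighted matrix:", after[0])
```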
Once familiar with the process, the students were ready to apply their decision-making model to the content in biology. In preparation for this authentic task, the class read and discussed material on genetic engineering and biotechnology, including the late Judge John Sirica's decision to prohibit experiments that would release engineered organisms into the environment. The issue presented to the students was this: who should regulate and monitor biotechnology, society or the scientific community (Iozzi and Bastardo 1990)?
Working in cooperative groups, the students determined the alternatives, the criteria, the weighting factors, and the final decision scores. Each group generated a decision and shared it orally with the rest of the class.
Now that the students had practiced on two types of decisions, they were ready to begin the authentic assessment. It would allow them to demonstrate their decision-making skills and their ability to integrate and apply these skills in new situations. Students were allowed to choose among several controversial issues in biology. The issues included in vitro fertilization, alternatives to animal experimentation, and organ donation.
We provided students with initial background information on each issue (the data were taken from Taking Sides: Clashing Views on Controversial Bio-Ethical Issues, Levine 1990). Students cooperated in gathering further information about their issue. We encouraged them to seek information from a variety of sources.
In the assessment, students were to develop a decision-making matrix using criteria, alternatives, and weighting factors to generate scores based on the information that they had gathered. We also asked the students to prepare a visual representation of their decision. It was to be used as groups presented their issue and decision to the class.
To help students develop their representation, we showed them how to change their decision matrix into a Microsoft Works spreadsheet. A student-generated spreadsheet on the issue of organ donation lists the alternatives considered: (1) leave the laws as they are currently written (do nothing), (2) harvest organs only with prior permission, and (3) stop harvesting organs entirely. The criteria for making the decision were: (1) improvement of lives, (2) prevention of deaths, and (3) ethical beliefs.
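As a rough sketch of what that spreadsheet step looks like in code, the snippet below writes the organ-donation matrix to a CSV file that any spreadsheet program (the class used Microsoft Works) can open. The alternatives and criteria are the students'; the weights and scores are placeholders invented only to show the layout.

```python
# Write a decision matrix to a CSV file so it can be opened and edited in a
# spreadsheet program. All weights and scores below are placeholders.
import csv

criteria = ["improvement of lives", "prevention of deaths", "ethical beliefs"]
weights = [0.40, 0.40, 0.20]                        # proportions that sum to 1

alternatives = {
    "leave the laws as currently written (do nothing)": [5, 5, 3],
    "harvest organs only with prior permission":        [3, 3, 5],
    "stop harvesting organs entirely":                  [1, 1, 4],
}

with open("organ_donation_matrix.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["alternative"] + criteria + ["weighted total"])
    writer.writerow(["(criterion weights)"] + weights + [""])
    for name, scores in alternatives.items():
        total = sum(w * s for w, s in zip(weights, scores))
        writer.writerow([name] + scores + [round(total, 2)])
```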
The students' conclusion was interesting and noteworthy. They decided to let the laws governing the removal of organs stand as they are presently written.
We also instructed the students on the computer procedure for converting spreadsheet information into a graph. Once the spreadsheet and graph data were entered, students were able to print both the spreadsheet and graph on a transparency. The students thus created a professional-looking visual aid to help them lead the class discussion about their issue.
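A modern stand-in for the Microsoft Works graphing step (the class printed theirs on a transparency) might use a plotting library such as matplotlib; the totals below are the placeholder figures from the CSV sketch above, not the students' data.

```python
# Chart each alternative's weighted total so the matrix can be shown to the
# class. Values are the placeholder totals from the CSV sketch above.
import matplotlib.pyplot as plt

alternatives = ["leave laws as written", "prior permission only", "stop harvesting"]
weighted_totals = [4.6, 3.4, 1.6]   # illustrative values only

plt.bar(alternatives, weighted_totals)
plt.ylabel("Weighted total score")
plt.title("Organ donation decision matrix")
plt.tight_layout()
plt.savefig("organ_donation_decision.png")   # or plt.show() to display on screen
```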
The Assessment of Decision Making
To judge the students' performance on the task, we used three assessment criteria, each framed as a question:
- Did the students' final recommended decision match their chosen criteria?
- Did the students apply accurate and important biological information to the process of decision making?
- Did the students select appropriate and important criteria with which to assess the alternatives?
The McREL assessment model also specifies a four-level rubric that describes observable characteristics by which student performance can be gauged against each of the assessment criteria (Redding 1992). For example, our final criterion for the process of decision making read, “Did the student select appropriate and important criteria with which to assess the alternatives?” The rubric for this criterion reads:

Level 4: The student clearly and completely identified the criteria by which the alternatives were assessed. The criteria were presented in detail and reflected an unusually thorough understanding and concern for the repercussions of the decision.

Level 3: The student clearly identified the criteria by which the alternatives were to be assessed. With no significant exceptions, the criteria were appropriate to the alternatives and important to the decision task.

Level 2: The student correctly identified the principal criteria by which the alternatives would be assessed. Some criteria might be omitted, or included criteria might not be important factors for consideration or entirely appropriate for the decision.

Level 1: The student specified criteria that were not appropriate for the selected alternatives or of importance to the decision.
Our students knew that Level 3 was the standard and that all work was expected to reach that quality. Level 4 represented exemplary work, and we encouraged students to make that their goal. Both Bob and I assessed the results, but students also used the McREL rubrics to assess their own performances.
What Is the Difference?
This authentic task and its assessment were an excellent way for students both to learn information and to grasp the relationships among the data. Instead of just collecting and sharing factual information with their classmates, students needed to synthesize what they had learned, integrate it into the form of a decision, and then justify that decision to others. The students also learned a great deal about the process of making decisions that have no clear-cut answer.
The conversations that took place during cooperative learning were lively. They engaged every student. Furthermore, students took responsibility for some of their own learning, especially when they orchestrated the class discussion.
The preparation for the assessment seemed to work very well. The pre-assessment activities provided students with all the skills and experiences needed to perform successfully in the assessment. Students also knew all of the assessment criteria right from the start, so they had a clear, stationary target to aim at.
Students were able to transfer their learning, too. A few days after the student presentations, a representative from the Aurora Water Department was a guest speaker in the class. She presented material on how water was delivered to the city. Then she asked the students what they thought her real purpose was in visiting the class that day. Confidently, a student answered, “To talk about making decisions about water use.” The surprise on the face of the guest speaker was apparent.
The animation of the students was even more unmistakable. Eagerly, they helped the guest speaker generate a list of alternative ways to meet Aurora's growing demand for water and a list of criteria by which to judge the appropriateness of each alternative.
Finally, this assessment procedure exposed students to a new way to use technology, namely creating a high-quality product to use in a classroom presentation. Though many students knew about word processing programs, none were familiar with either the spreadsheet or graphing applications.
In short, we think that authentic tasks and assessments such as the one we have described will help our students graduate better equipped to handle the complexities and uncertainties of the 21st century.