June 10, 2021 | Vol. 16, No. 19

Assessing Deeper Learning After a Year of Change

Jay McTighe and Christopher R. Gareis
Though it may be tempting to try to teach faster to "catch up" on missed content, the best course of action is to slow down and go into greater depth.

The profound disruptions to traditional schooling this past year leave educators with a unique opportunity to redefine school when we resume in the fall. Though it may be tempting to try to teach faster to "catch up" on missed content, we contend that the best course of action is to slow down and go into greater depth to ensure that students learn the most important ideas and skills deeply and are able to apply them in meaningful ways.
Others have written about the instructional approaches needed to achieve deep learning (McTighe & Silver, 2020; Fullan, Quinn, & McEachen, 2018; Mehta & Fine, 2019). But it is especially important to consider assessment when thinking about deeper learning.

Deeper Learning, Unpacked

The National Research Council (2012) defines deeper learning as a "process through which an individual becomes capable of taking what was learned in one situation and applying it to a new situation" (p. 5). We propose that a modern education should prepare students to be able to apply their learning to new situations—in other words, to transfer. Grant Wiggins (2012), co-author of Understanding by Design, highlights that transfer exists when a student makes use of their learning in a time, place, and circumstance different from that in which the learning first occurred.
But how, exactly, will we know that students have learned deeply?
This question reflects an if-then logic: If we believe that preparing students to transfer their learning is a fundamental aim of a modern education, then we need to collect the necessary evidence to determine the extent to which students can demonstrate deeper learning through transfer.
An unfortunate consequence of the standards-based testing movement is that many schools teach and test grade-level standards in isolation using multiple-choice items or institute formalized "test prep" assessments that mimic the format of standardized accountability tests. While such assessment methods may provide measures of students' acquisition of knowledge and basic skills, they don't tell us much about deep understanding and transfer.
We contend that transfer of learning is shown best through performance assessments that require students to perform with their learning. Well-designed performance tasks can assess multiple standards and cut across subject areas, prompting complex thinking and a tangible product or performance that allows students to apply their learning. An athlete playing in a game needs to know the rules of the sport, apply the skills, and also display strategic understanding. Though a player could demonstrate knowledge of the rules on a paper and pencil test or through practice drills, the actual game requires the player to "put everything together" on the field. Similarly, effective performance assessments require knowledge, skills, and strategy, applied in context.
Consider the three examples of performance assessment tasks in Figure 1. What must a student do to successfully complete each task? How do these assessments differ from selected-response tests?

Figure 1. Three Sample Performance Tasks

Evaluate a Claim

The Pooper Scooper Kitty Litter Company claims that its litter is 40 percent more absorbent than other brands. You are a consumer-advocacy researcher who has been asked to evaluate this claim. Develop a plan for conducting the investigation. Explain how the design of your investigation will enable you to evaluate the claim.

Make Your Case

You have an idea that you believe will make your school better, and you want to convince school leaders that they should act on your idea. Identify your audience (e.g., principal, PTSA board, the student government association) and do the following:

  1. Describe your idea.

  2. Explain why and how it will improve the school.

  3. Develop a plan for acting on your idea.

Your idea and plan can be communicated to your target audience in a letter, an e-mail, or a presentation. Be sure to choose the means of communication that is most appropriate for your audience and purpose.

What's the Pattern?

Part 1 – Interpret the data on the incidence of COVID-19 infections and associated mortality rates on each continent for the past three months. (Students are given data sets to analyze.) Prepare a chart, podcast, or newspaper article to help people understand any patterns you detect.

Part 2 – Select four countries and compare their governmental policies enacted to mitigate the infection's spread. Provide an explanation of the link between their policies and the data on infections and mortality rates. Prepare a newspaper article, podcast, or vodcast to present your conclusions.

These examples reveal four characteristics of performance tasks for assessing deeper learning (McTighe, Doubet, & Carbaugh, 2020):
  1. Performance assessment tasks call for students to (1) apply their learning in some context, and (2) explain what they have done. Whether a task calls for a written response (e.g., an academic essay or blog post), a spoken response (e.g., an audio recording or a live debate), or a visual or physical communication (e.g., an infographic or an interpretive dance), students need to convey their reasoning, justify their decisions, and support their interpretations.
  2. Any performance assessment of deeper learning needs to engage the student in transferring their learning to a novel situation, different from that in which it was initially learned. Benjamin Bloom and his colleagues described this application category of their Taxonomy of the Cognitive Domain in 1956: "If the situations … are to involve application as we are defining it here, then they must either be situations new to the student or situations containing new elements as compared to the situation in which the abstraction was learned. … Ideally, we are seeking a problem which will test the extent to which an individual has learned to apply the abstraction in a practical way …. Problems which themselves contain clues as to how they should be solved would not test application" (p. 125).
  3. An effective performance assessment task engages students in complex thinking. We recommend the use of Depth of Knowledge, a four-level framework developed by Norman Webb and his colleagues (2005) for analyzing the cognitive complexity of any task. When the goal is assessing deeper learning, the tasks need to operate at Levels 3 and 4, which require higher-order thinking such as analysis, interpretation, investigation, problem solving, argumentation, and design. Use multiple-choice and short-answer test and quiz items (at Levels 1 and 2) to assess factual knowledge and discrete skills, and reserve performance tasks to assess deeper learning.
  4. The best performance tasks establish a "real world" context for application, in which learners must effectively apply (i.e., transfer) their learning to realistic situations.

Setting Goals and Tasks

Authentic tasks reflect a worthy goal, a target audience, realistic constraints, a tangible product or performance, and success criteria. They can vary considerably in terms of their time frame, complexity, nature of the products/performances, and whether the targeted content and skills are discipline-specific or interdisciplinary. Such performance tasks can range from conventional essays to open-ended mathematics problems to scientific experiments to research projects to tackling community-based issues (Wren & Gareis, 2019). Wiggins and McTighe (2012) offer a practical framework, captured in the acronym GRASPS, that can be used to develop an authentic context for an assessment task (see Figure 2).

Figure 2. GRASPS Framework

- A real-world Goal,

- A meaningful Role for the student,

- An authentic (or simulated) Audience,

- A contextualized Situation that involves real-world application,

- A student-generated culminating Product and/or Performance, and

- Success criteria by which student products and performances will be evaluated as evidence of learning.



Figure 3 presents an example of a performance assessment task used as part of the study of a state or province. Can you recognize the GRASPS elements contained within the task prompt?

Figure 3. Sample GRASPS Performance Assessment

State Tour

A group of six exchange students is visiting your school for one month as part of an international exchange program. The principal has asked your class to plan and budget a four-day tour of [your state or province] to help the visitors understand the state's history, geography, economy, and cultural elements. You should prepare a written tour itinerary, including an explanation of why each site was selected. Include a map tracing the route for the four-day tour. Optional extension: Include a budget for the trip.



The authentic nature of performance tasks is often motivating to students. However, when these tasks are used for assessment purposes, they must meet measurement standards of validity and reliability.
Performance assessment tasks must enable educators to answer key questions:
  • Does a performance task provide the necessary and sufficient evidence to enable teachers (and others) to determine the degree of a student's deeper learning?
  • Does the student's performance provide evidence that they can effectively apply their learning to new situations?
  • Can we trust that a student's performance is not influenced (whether positively or negatively) by raw chance, poor construction of the assessment itself, implicit biases, cheating, or inconsistency in the teacher's evaluation of students' work?
A common concern about the validity and reliability of performance assessments is that their scoring is "too subjective." While this can be a potential problem—especially with poorly designed tasks and scoring rubrics—the challenge is not insurmountable. After all, we use judgment-based evaluations routinely in state writing assessments, AP art portfolio reviews, judging in Olympic events, and when we rate a restaurant. Indeed, there are well-established practices that enable performance assessments and judgment-based evaluation to function fairly, consistently, and defensibly (Gareis & Grant, 2015; McTighe, 2013).

Deeper Learning for School and Life

There's a cartoon that shows a recent graduate interviewing for a job. Across the desk is a besuited business executive who has just asked a question of the owl-eyed youngster. On the executive's face is an expression of dismay. The caption—spoken by the recent graduate—reads something like this: "Can you give me four choices?"
The comic is a satirical commentary on one of the unintended consequences of high-stakes standardized tests. If we feed students a steady diet of multiple-choice questions throughout their school careers, should we be surprised that they (and the graduates they become) will frame problems in terms of a set of selectable "answers"? The stone-faced expression on the prospective employer's face betrays his experience in the "real world," where issues and problems are often complex and do not lend themselves to fixed-response solutions. Today's students need deeper learning to successfully navigate the opportunities and challenges they will face in the world beyond school. Achieving this goal will require shifts in curriculum, instruction, and especially, assessment practices.

As Grant Wiggins reminds us, "The point of school is not to get good at school but to effectively parlay what we learned in school into other learning and in life." Deeper learning enables transfer. Performance assessments give us the evidence that students are indeed learning deeply—and are able to apply that learning in school and life.
References

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). The taxonomy of educational objectives, handbook I: The cognitive domain. New York: David McKay Co.

Fullan, M., Quinn, J., & McEachen, J. (2018). Deep learning: Engage the world. Change the world. Thousand Oaks, CA: Corwin Press.

Gareis, C. R., & Grant, L. W. (2015). Teacher-made assessments: How to connect curriculum, instruction, and student learning. New York: Routledge.

McTighe, J., & Curtis, G. (2019). Leading modern learning: A blueprint for vision-driven schools (2nd ed.). Bloomington, IN: Solution Tree Press.

McTighe, J., Doubet, K., & Carbaugh, E. (2020). Designing authentic performance tasks and projects: Tools for meaningful learning and assessment. Alexandria, VA: ASCD.

McTighe, J., & Silver, H. (2020). Teaching for deeper learning: Tools for engaging students in meaning making. Alexandria, VA: ASCD.

McTighe, J. (2013). Core learning: Assessing what matters most. Midvale, UT: School Improvement Network.

Mehta, J., & Fine, S. (2019). In search of deeper learning: The quest to remake the American high school. Cambridge, MA: Harvard University Press.

National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: National Academies Press.

Webb, N. L., Alt, M., Ely, R., & Vesperman, B. (2005). Web Alignment Tool (WAT): Training manual 1.1. Madison, WI: Wisconsin Center for Education Research, University of Wisconsin–Madison.

Wiggins, G., & McTighe, J. (2012). The understanding by design guide to advanced concepts in creating and reviewing units. Alexandria, VA: ASCD. 

Wren, D., & Gareis, C. (2019). Assessing deeper learning: Developing, implementing, and scoring performance tasks. Lanham, MD: Rowman & Littlefield.

Jay McTighe has had a varied career in education. He served as director of the Maryland Assessment Consortium, a collaboration of school districts working to develop and share formative performance assessments. Prior to that, he helped lead standards-based reforms at the Maryland State Department of Education, including the development of performance-based statewide assessments.

Well known for his work with thinking skills, McTighe has coordinated statewide efforts to develop instructional strategies, curriculum models, and assessment procedures for improving the quality of student thinking. He has extensive experience as a classroom teacher, resource specialist, and program coordinator, and he is a regular speaker at national, state, and district conferences and workshops.

McTighe is an accomplished author, having coauthored more than a dozen books, including the award-winning and best-selling Understanding by Design® series with Grant Wiggins. He has written more than 50 articles and book chapters and has been published in leading journals, including Educational Leadership (ASCD) and Education Week.

UNDERSTANDING BY DESIGN® and UbD® are registered trademarks of Backward Design, LLC used under license.
