Evidence That Makes it Evident: Improving Assessment by Emulating the Trades

By Elliot Washor and Charles Mojkowski, Big Picture Learning

One positive outcome of current education reimagination and transformation initiatives is that educators are broadening and deepening their expectations of the graduates they need to produce. The traditional academic focus is now accompanied by other essentials: deep learning, social-emotional learning, workplace readiness, social capital, and similar sets of competencies (knowledge, skills, and dispositions) and accomplishments.

This expanded vision of the graduate has, however, widened the gap between the competencies educators (and employers, parents, and communities) want for all graduates and the evidence of competence educators are actually collecting. Schools are failing to deepen and broaden their assessments to include the new competencies and are, in fact, testing fewer and fewer of the dimensions that workplaces and communities will require of graduates.

For over twenty years, we at Big Picture Learning have railed against the myopic practice of using a few narrow and shallow slices of evidence in educational assessment. We have complained about the artificial ways in which evidence of competence and accomplishment was gathered. We have protested that such meager data were used to hide rather than reveal each student as an individual learner. We were not alone in those protests; we were joined by many others as insightful and articulate as we aspired to be.

Nevertheless, the problems with assessing learners and learning persist, and may have metastasized despite the use of upgraded tools and technologies. Educators have embraced evidence as their latest refrain for judging whether schools, principals, and teachers are truly effective in improving student performance. The technicians continue to tweak testing and test score analysis, but the nature of the evidence and how it is obtained and employed are woefully inadequate to guide the preparation of the graduates we need.

The perennial questions persist. What learning should be assessed? How should learners and learning be assessed? How should assessment information be used to make decisions about learners and learning?

We are revisiting these questions in our current pilot test of a new form of Career and Technical Education (CTE) that is focused on three of the design principles at the core of all of our work: learning through interests; learning with, through, and from others; and deep and purposeful practice. That work, focused on the traditional and contemporary trades, has brought us back to assessment. What indicators and measures, instruments and methods, and approaches to data analysis align with those design principles?

In the new program, the Harbor Freight Fellows Initiative (HFFI), we identify young people with a deep interest and talent in a trade. At the heart of the program is a new form of apprenticing in traditional skilled trades pathways, as well as in non-traditional areas such as hands-on pathways in the sciences and technology.

HFFI blends exemplary CTE programs, apprenticing, and communities of practice into a new milieu where work is practical, relational, situational, and meaningful. Our mission is two-fold: to elevate the respect for, and support of, the trades as valued pathways for youth to achieve fulfilling adult lives equal to any academically rigorous program, and to help programs involve students in professional tradecraft communities.

We help these youths find mentors and coaches who guide them in burnishing their competence. We employ a new form of apprenticing, inspired by the formal apprenticeship model but with fewer formalities. Students work closely with an expert in their trade in real-world settings. We connect their teachers to their mentors so that everyone is working together to ensure a shared understanding of what each student can do and what new learning is appropriate.

These apprenticing experiences sit at the midpoint between a typical high school internship and a traditional apprenticeship in a trade such as plumbing or electrical work. There is a heavy focus on competence, which is judged through performances and artifacts. Seldom will a traditional written test satisfy our need to guide improvement and certify competence. Instead, we are employing methods that surface the deep thinking of experienced makers and fabricators: What do expert makers and fabricators know that gives them immense confidence in solving problems in their real work?

We are giving much more attention to competencies that are highly valued in the workplace and community. Consider, for example, these two accomplishments, among several, that we have identified as important for the HFFI students. During their apprenticing experience, we expect that all HFFI students will have:

  • Established a network of diverse peer and adult mentors whom they engage in supporting both their learning and work, and in advancing their tradecraft, careers, and lifelong learning.
  • Developed a portfolio of learning and work artifacts, performances, and certifications that demonstrate competence and accomplishments.

For each of these accomplishments, we identify the competencies required to achieve the outcome. We then determine the evidence the students need to provide to demonstrate their proficiency and readiness to move on to other challenges. We help each student design a continuum of learning experiences that requires increasingly sophisticated student performances that demonstrate competence.

We are assessing these valuable competencies in real-world settings and contexts and engaging expert practitioners as partners in the assessment. We are contextualizing assessments and relating them to each student’s interests as they work within communities of practice around their trades.

But we are not only accounting for the competencies we value; we are also assessing student performances and artifacts that demonstrate understanding. Of course, we continue to assess basic literacy and numeracy, abstract reasoning, and core knowledge, but always within the larger context of a broader and deeper set of valued accomplishments that give the disciplines an authentic fit in the trade. Moreover, we assess not just know-what and know-how, but also "know-who," the development of relationships and social capital in real-world contexts and settings.

Colleague Frank Wilson, a retired professor of neuroscience at Stanford and author of The Hand, provides, in a personal communication, the rationale for our approach: “The maker of any object has no choice but to observe the results of her own actions at every step of the way, until the object has been completed… A written test created by someone not concerned with the real-time endeavors (the sensory motor and perceptual immediacy) of real people doing real jobs rarely bears any useful relationship to the exercise of professional judgment and skill, and even more rarely teaches people how to integrate their own bodily senses and perceptual skills into their professional (or even their daily life) thinking.”

What can we learn from forms of authentic and organic assessment that have been practiced in other areas of education? Two examples come to mind: 1) the performance assessments conducted in many CTE programs and apprenticeships, and 2) the self- and peer assessments that craftsmen employ as part of their making, individually and as part of a team or community. Such assessments are hand-and-mind, body-and-soul endeavors. In The Ascent of Man, Jacob Bronowski reminds us, "the hand is the cutting edge of the mind." And Wilson adds, "You can't talk about human intelligence without talking about the hand."

Take, for example, the assessments of an auto technician or a nurse, where the learner must diagnose a problem, alone and in discussions with others, and propose solutions. Or consider the production of a cabinet or desk or a piece of glassware. Here, the knowledge is embedded in the skill. Learning is deepened through fabricating, through an ongoing assessment “dialogue,” or “correspondence,” as Scottish anthropologist Tim Ingold describes the relationship between the maker and the object or the performer and the performance. Think of the architect designing a building or space, the doctor conducting a complex surgery, the auto technician diagnosing an emissions problem, and the lawyer preparing a brief. In each, the “performer” is thinking through doing and doing through thinking, formulating and testing heuristics for addressing challenges in a system full of assessment checks and balances where self-assessment is balanced with judgments from a community of assessors—peers and expert practitioners as well as traditional teachers.

These genuine types of self-assessments and group decisions evolve over time. There is no way the student can pass them without knowing that they passed. The student is connecting people, objects, and places in settings (POPS) that give them the immense confidence to solve new problems. Students need time to practice, time to stop and think, to put things aside and get back to them, and time to discuss the problems with others. This is a natural way of learning and performing—and of assessing.

To support this way of knowing, we are attempting to create and use “instruments as good as our eyes,” assessments in which the learning is self-evident and utterly transparent to the learner and others. Such work will close the gap between the expectations we have for our graduates and the evidence we collect about those expectations.

For more information on this work, please visit NavigatingOurWay.com, HarborFreightFellows.org and BigPicture.org, or reach out to us directly at [email protected].


Elliot Washor, Ed.D., has been involved in education reform and school design for more than 35 years as a teacher, principal, administrator, designer, video producer, speaker, and writer. Twenty-one years ago, Elliot co-founded Big Picture Learning and the Met Center in Providence, RI. Connect with him on Twitter @Elliot_Washor or by email at [email protected].

Charles Mojkowski is an independent consultant in the areas of school and curriculum improvement, leadership and organizational development, and applications of technology that support that work. He has worked with Big Picture Learning since 1995. Connect with him on Twitter @ChazMojkowski. 
