Measuring Connections in Career-Connected Learning

Key Points

  • Future Focused Education and Big Picture Learning are proving it’s possible to understand the social side of work-based learning. 

  • Both organizations’ commitment to equity has led them down that path.


By: Julia Freeland Fisher

In recent years, bipartisan support for career-connected learning has blossomed among education advocates, policymakers, and philanthropists. With it, a growing supply of work experiences, like internships, is on offer to high school students.

But that enthusiasm has fallen into an all-too-common trap in education reform: indexing progress based on inputs, rather than outcomes.

Pathways or dead ends?

Career education policies are partly to blame. Under the Perkins Act, for example, states report the number and percentage of students participating in work-based learning. Access indicators say little about the skills or connections students are (or aren’t) developing through a given experience.

At the same time, schools focus far more energy on the logistics of implementation than on outcomes, juggling a dizzying array of schedules, employer partnerships, and credit-hour documentation. That leaves little room for considering how students are faring. “It’s pretty rare for schools to have the staff capacity to support caring about quality,” said Wilson Platt, a product manager for Big Picture Learning’s internship management platform, ImBlaze. When Platt’s team partners with school districts, headcount is the most common indicator schools track.

A recent Bellwether report, “Expanding Opportunity”, confirmed why leaders shouldn’t count on headcount. Program quality, even in states heralded for expanding their career pathways, remains a veritable mystery:

“One administrator in Texas said that the lack of data about program outcomes in their district meant that it took them years to realize that a prominent pathways program for military services was not actually preparing students for the careers the district intended. A former state policymaker in Ohio explained that ineffective programs can grow alongside effective ones because there is such little data about which programs are effective and which are not. Advocates in Colorado shared frustrations about the inefficient and often conflicting data systems used by different state and local agencies, which monitor pathways program outcomes in a piecemeal fashion that does not yield useful information.”

Anecdotes like these are the canary in the coal mine. Without measuring efficacy alongside scaling access, career pathways will be yet another failed, expensive edu-fad marked by good intentions and hollow outcomes.

Measure connections to close opportunity gaps

Luckily, a number of initiatives are bringing greater rigor to measuring the durable skills and industry credentials students are gaining through work-based learning.

That’s one important piece of the quality picture. But it doesn’t tell the whole story.

Social capital, not just skills, is a leading predictor of economic mobility. An estimated half of college internships and full-time jobs come through networks. To gauge quality, schools must ask: are interns forming connections who can serve as role models, channels to job information, and references down the line?


Measuring networks, alongside skills, will be the differentiator between good-enough initiatives for some students and those that level the economic playing field for all students.

An updated report we released this month, The Missing Metrics: Emerging Practices for Measuring Students’ Relationships and Networks, describes strategies to measure not just career know-how, but also know-who. The report offers sample survey questions from the field, including from two nonprofits in the high school internship space, Future Focused Education (FFE) and Big Picture Learning.

A clearer picture of students’ social capital

A few years ago, FFE set out to understand whether the interns they support were growing their social capital. Their survey used what sociologists call “name generator” questions, prompting students to consider the number of people they’d met during their experience:


“When you think of all the people that you worked with during your internship experience, how many could you tell us about? Please consider including teachers or staff, internship mentors, internship coworkers, X3 or NeXt Coach, and peer interns.”  

From there, students were asked to describe the quality of each connection, including:

“They helped me learn from setbacks; They worked with me to solve problems and reach goals; They inspired me to see possibilities in my future; They introduced me to people who can help me learn and grow” on a scale from “almost never true” to “often true” (these items draw heavily on the Search Institute’s developmental relationships framework).

Finally, students were asked about the comfort and durability of each connection: “How likely would you be to ask this person for help with your career in the future?”

The resulting data painted a multi-dimensional portrait of interns’ networks. It also yielded valuable information in aggregate: on average, interns reported forming 3.3 developmental relationships, and greater numbers of new relationships were statistically associated with greater overall skill mastery.
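For readers curious how responses like these roll up into figures like the ones above, here is a minimal sketch of one way to tabulate name-generator data: count each intern’s developmental connections, then check how counts track with reported skill mastery. This is not FFE’s actual instrument or analysis; the field names, the 1–4 scoring, and the threshold are all hypothetical.

```python
# A minimal sketch (not FFE's actual instrument or analysis) of tabulating
# name-generator survey data: count each intern's developmental connections
# and check how counts track with reported skill mastery.
# Field names, the 1-4 scoring, and the threshold below are hypothetical.
# Requires Python 3.10+ for statistics.correlation.
from statistics import correlation, mean

# One row per (student, named connection); each item scored from
# 1 ("almost never true") to 4 ("often true").
responses = [
    {"student": "A", "setbacks": 4, "goals": 4, "inspiration": 3, "introductions": 4},
    {"student": "A", "setbacks": 4, "goals": 3, "inspiration": 4, "introductions": 3},
    {"student": "B", "setbacks": 3, "goals": 4, "inspiration": 3, "introductions": 3},
    {"student": "B", "setbacks": 2, "goals": 1, "inspiration": 2, "introductions": 1},
    {"student": "C", "setbacks": 1, "goals": 2, "inspiration": 2, "introductions": 1},
]
skill_mastery = {"A": 3.8, "B": 3.2, "C": 2.5}  # hypothetical overall mastery scores

def is_developmental(row, threshold=3.0):
    """Count a connection as developmental if its items average at least the threshold."""
    items = [row["setbacks"], row["goals"], row["inspiration"], row["introductions"]]
    return mean(items) >= threshold

counts = {student: 0 for student in skill_mastery}
for row in responses:
    if is_developmental(row):
        counts[row["student"]] += 1

students = sorted(counts)
print("Average developmental relationships:", mean(counts[s] for s in students))
print("Relationship count vs. skill mastery:", correlation(
    [counts[s] for s in students], [skill_mastery[s] for s in students]))
```

The point of a tabulation like this is simply that the same survey can serve two audiences: per-student portraits for advisors, and program-level averages and associations for leaders weighing quality.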

Unlocking formative relationship data

FFE’s is the most comprehensive measure of interns’ networks that we’ve found in the field to date. But it isn’t without drawbacks. Collecting social network data is time-intensive, and the added questions can crowd post-internship surveys.

That challenge highlights one upside of Big Picture Learning’s internship management platform, ImBlaze. The platform includes a “dynamic survey tool,” allowing schools to ask interns and supervisors brief questions throughout the day, week, or semester.

One school, Ken-Ton Big Picture, used that functionality to ask students three questions repeatedly over the course of a year, gauging their comfort, their connection to their on-site mentor, and their likelihood of working in the career area. Eighty-five percent of students felt ‘very comfortable’ and ‘well connected’ at some point in their experience. “That then led them to wonder about the 15% that never reached that point,” explained Platt.

A dynamic approach avoids lengthy end-of-program surveys and can yield more reliable, actionable relationship data. “For example, an internship advisor can do a quick glance at the data at the end of the day. If a student says they were uncomfortable at their work site, that advisor can check in and problem solve,” said Platt.
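To make the idea concrete, here is a minimal sketch of the two kinds of checks described above, assuming a simple daily pulse survey. It is not ImBlaze’s tool or API; the field names, the 1–4 scale, and the helper functions are hypothetical.

```python
# A minimal sketch (not ImBlaze's actual survey tool or API) of using brief,
# repeated pulse-survey responses two ways: an end-of-day check that flags
# interns who reported discomfort, and a year-end rollup of who ever reached
# the top comfort/connection ratings. All names and scales are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class PulseResponse:
    student: str
    day: date
    comfort: int            # 1 = very uncomfortable ... 4 = very comfortable
    mentor_connection: int  # 1 = not connected ... 4 = well connected

def flag_for_follow_up(responses, today, threshold=2):
    """Students whose comfort rating today fell at or below the threshold."""
    return sorted({r.student for r in responses
                   if r.day == today and r.comfort <= threshold})

def ever_reached_top(responses):
    """Students who, at any point, reported being 'very comfortable' and 'well connected'."""
    return {r.student for r in responses
            if r.comfort == 4 and r.mentor_connection == 4}

responses = [
    PulseResponse("Jordan", date(2024, 3, 5), comfort=1, mentor_connection=2),
    PulseResponse("Priya", date(2024, 3, 5), comfort=4, mentor_connection=4),
    PulseResponse("Jordan", date(2024, 5, 14), comfort=3, mentor_connection=3),
]
print(flag_for_follow_up(responses, date(2024, 3, 5)))  # -> ['Jordan']: check in today
all_students = {r.student for r in responses}
print(all_students - ever_reached_top(responses))       # -> {'Jordan'}: never hit the top ratings
```

The design choice worth noting is the cadence: because the questions are short and repeated, the data can drive same-day follow-up rather than a retrospective report, which is exactly the advantage Platt describes.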

Both FFE and Big Picture Learning are proving it’s possible to understand the social side of work-based learning. Both organizations’ commitment to equity has led them down that path.

More education leaders should share that commitment. When it comes to traditional academics, measuring inputs alone would hardly pass the sniff test. “You’d never proudly say that 100 of your 150 students took a math class this year. But that’s what many do when it comes to internships,” said Platt. As career-connected learning expands at an impressive clip, schools must stop measuring participation alone, and start measuring career connections.

Julia Freeland Fisher is the Director of Education at Clayton Christensen Institute.
