Zip Code Boundaries for Online Learning Defeat the Purpose
Creating zip code boundaries around virtual opportunities sort of defeats the purpose, right? That’s just one of the recommendations from NEPC in their annual plea to block progress. Actually, they were so fond of geo-limits that they mentioned them twice.
The other 14 recommendations aren’t all that bad: five are common in state policy and seven hold promise for improving online schools, but a couple, like geographic boundaries, just don’t make sense (unless, of course, their objective is simply to block online learning).
The magical thing about digital learning is that anyone can learn just about anything anywhere. Family learning opportunities are no longer limited by zip code. Anywhere-anytime learning resources have been available for 20 years, and for a decade there’s been plenty of capacity to offer every high school student in America access to a great college-prep high school experience, including every AP class, a dozen world languages, and hundreds of electives. All of these classes should be available to every student, any time, at no cost, on a full- or part-time basis. States would actually save money and benefit from higher college completion rates by simply ensuring that every student had access to quality learning opportunities.
That’s why limiting online learning to zip codes, or curtailing rather than expanding access, is exactly the wrong thing to do. Instead, states should expand part-time online enrollment (course choice) and build reciprocal agreements with other states so that a great physics teacher can serve a thousand students nationwide rather than 100 from his zip code.
NEPC has a couple of common-sense ideas, like more research–bring it on! They also suggest new outcome measures for virtual schools–actually, we need better growth measures for all schools so that we can easily compare progress in different environments.
They are also right that we could use better accountability measures–especially for all the virtual schools that take on the challenge of over-aged and under-credited kids seeking a last chance at a high school diploma.
The report suggests developing rubrics for online teacher evaluations. They should just call the three largest providers–they all have well-developed evaluation systems.
The worst idea was to “stop growth”…but that was the point of the report.
I’m a director of the International Association for K-12 Online Learning (iNACOL) and support the organization’s interest in quality assurance. We can and should expand access to high quality, cost effective, full and part time online options for students and families.
Tom, as one of the authors of this report, I'd like to take issue with one of your points. You indicate that the report recommends that we should "stop growth," which isn't accurate. What we do recommend is that we slow growth and manage it in a responsible way. One thing the history of K-12 online learning has shown us, particularly in the full-time online schools sector, is that the jurisdictions where we have seen significant growth are the same jurisdictions where programs have had problems with student performance. However, we have also seen, both from experience and research, that programs that have followed a managed-growth model, either by program plan or because of regulation, have far fewer problems with student performance. I actually find it a little ironic that you claim to want high-quality learning (in fact, it is included in several of the principles of your Digital Learning Now initiative), yet you continue to advocate for policies that have failed to achieve high-quality learning for even the majority of students who are enrolled in these "opportunities."
Tom Vander Ark
The "performance problems" you reference are often more reflective of the schools that students left than the schools they attended during testing. Here's one version of the problem: a student falls behind, gets in academic trouble, leaves school, and looks for alternative pathways. They find a virtual school, enroll late, make some progress, and then take a state test--surprise, they are three years behind. This may be the case despite making 1.2 years of progress during the six months they were enrolled before testing (according to widely used pre-post adaptive tests).
State growth models are a crude estimate of cohort growth using outdated tools. That's why for me (and iNACOL) better and comparable measures of student growth are essential to understand what kind of "performance problem" we're dealing with. I'm all in favor of rigorous authorizing and strong accountability (for all schools/providers), but it would help to work with better data.
Tom, that is a circular argument, and one that affects brick-and-mortar schools in jurisdictions with full-time online schools more than the other way around. Given what the research (and the individual state reporting data) tell us about retention in full-time online schools, there are many more students leaving these full-time online schools mid-semester, well behind, and joining brick-and-mortar schools than the other way around. This is well documented in the Arizona and Colorado data that have been published! But it is a convenient excuse that your side often espouses.