I’m attending a Washington, DC conference on Technology Enhanced Assessments hosted by the K12 Center at ETS and CCSSO. This afternoon I’m speaking about tech-driven improvement in US K-12 assessment. Advances will occur in three phases over the next five years:

  1. Summative. Make SBAC and PARCC tests as good as planned (2012-14)
  2. Formative. Leverage improved access and classroom assessment (2013-15)
  3. Big Data. Use student profiles to power personalized learning (2014-16)

I spent most of the spring on a phase one effort: the Hewlett-funded Automated Student Assessment Prize (ASAP), an effort to support the consortia’s interest in incorporating writing and performance assessments. We’re making an exciting announcement tomorrow about the winners of the $100,000 prize purse (read more about How Formative Assessment Supports Writing to Learn). I support the consortia’s plans but, as James Pellegrino said yesterday, “We should think of this as a ten-year process with several phases.” While I appreciate the complexity of preparing for the 2014-15 advent of online assessment, this post will look just beyond those first-generation tests to what will be possible in the second half of the decade.

Multiple Shifts. Assessment advances are just one of six shifts that most American schools will experience in this decade:

  1. Print to digital
  2. Flat & sequential to engaging & adaptive experiences
  3. Annual tests to instant feedback
  4. Cohorts to individual progress
  5. Individual teacher practice to teams
  6. School as place to learning as blended services

With improved access and more powerful tools, assessment is undergoing four shifts:

  1. Periodic to continuous: all day long not just end of unit/year
  2. Foreground to background: most assessment will be embedded within learning experiences, produced products, and observed behaviors
  3. Artificial to authentic: simulations and real work products rather than bubble sheets
  4. Heavyweight to lightweight: when every student has a huge standards-based gradebook, a simple sample will suffice for summative assessment

These shifts will drive important developments in profiles, playlists, projects, and student progress models.

Profiles. Comprehensive profiles will include:

  • Standards-based gradebooks that are auto-populated with experience-embedded assessment as well as teacher observations and rubric-scored projects.  On an open-API platform, the gradebook will conduct regular data extracts from multiple learning apps (a rough sketch of such an extract follows this list).
  • Motivational profiles that use keystroke data to surface content that promotes persistence as well as performance.
  • Portfolio of artifacts: Assessment as Portfolio of Personal Bests.

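To make the open-API idea concrete, here is a minimal sketch of what a nightly gradebook extract might look like. The apps, endpoints, and field names are hypothetical assumptions for illustration; any real implementation would depend on shared data standards that are still emerging.

```python
# Illustrative sketch only: the apps, endpoints, and record fields below are
# hypothetical; no real learning-app API is implied.
import requests
from collections import defaultdict

LEARNING_APPS = {
    "math_app": "https://api.example-math.com/v1/mastery",
    "reading_app": "https://api.example-reading.com/v1/mastery",
}

def extract_gradebook(student_id: str) -> dict:
    """Pull standards-tagged mastery records from each app and merge them
    into one standards-based gradebook keyed by standard code."""
    gradebook = defaultdict(list)
    for app_name, url in LEARNING_APPS.items():
        resp = requests.get(url, params={"student_id": student_id}, timeout=10)
        resp.raise_for_status()
        # Assumed record shape: {"standard": "CCSS.MATH.4.NF.1", "score": 0.82}
        for record in resp.json():
            gradebook[record["standard"]].append(
                {"source": app_name, "score": record["score"]}
            )
    return dict(gradebook)
```
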
A thick summary of this profile will make up a student ‘data backpack’, an electronic student record that travels with a student. Parents will have the ability to share profile information with and invite contributions from other educational providers (e.g., after school, summer school, tutoring, test prep).

Playlists. Smart engines will produce custom playlists that students can start at school and finish at home. Including videos, learning games, simulations, and instructional units, playlists will be tailored to each student’s learning level and best learning modality. The best current examples of playlists are adaptive math products like DreamBox and Mangahigh and pilot projects like School of One. PowerMyLearning and Gooru have grade-level playlists that will probably be customized by predictive algorithms in the near future. Teachers will tailor playlists to help students meet learning targets and to prepare them to engage in interdisciplinary projects.

Projects. As tools improve, more schools are blending individual online skill development with rich interdisciplinary project-based learning (PBL). PBLU from the Buck Institute is a great place to start (and they’ll have a great app on Edmodo soon). Project Foundry helps most EdVisions schools create standards-based projects and track demonstrated competencies. Echo, from New Tech Network, is a PBL LMS. Next-gen tools will not only help teachers build customized standards-based rubrics but will also use mobile, social, and augmented reality features to incorporate local assets: businesses, museums, historical sites.

Progress will be based on demonstrated mastery. Evidence will include background data automagically collected from digital learning experiences plus end-of-unit and end-of-course demonstrations. Badges and other achievement recognition systems will help visualize learning progressions and will inform instructional choices. His videos are great, but Khan’s Big Contribution Will Be Competency-Based Learning. He makes a great case for mastering content at a high level before moving on; a strong foundation is key to success at higher levels. As more systems manage student progress based on a series of micro-assessments and demonstrations, age cohorts and end-of-year tests will become less important for matriculation management.

What The World Needs Now. Love would be great, but I’d settle for comparability. We’ve used a high bar for reliability and comparability in the past: ‘same instrument, same conditions.’ With the mountains of standards-aligned data headed our direction, I think we’ll be able to draw inferences about academic productivity from big data sets even though the learning experiences they represent were completely different. Common Core micro-standards that break standards into smaller chunks than grade-level equivalents will be particularly useful here. It’s time for Data Quality Campaign 2.0, a new baseline for state data systems starting with a student ‘data backpack.’ As more colleges accept or even demand a portfolio of student work, a set of standards for storing and sharing artifacts would also be useful (see Pathbrite for a good start).

Each of the three phases (described in the opening) will mark an order-of-magnitude increase in data available for the five big questions:

  • How to improve learning?
  • Are staff members contributing to learning?
  • Have students learned enough to progress?
  • Is a school/program producing learning gains?
  • Are policies productive?
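
As a rough illustration of how micro-standard mastery data drawn from very different learning experiences might feed these questions, here is a minimal sketch of the ‘lightweight sample’ idea described above. The record format, mastery threshold, and sample size are assumptions for illustration only, not a method defined by the consortia.

```python
# Illustrative sketch only: record format, threshold, and sample size are
# assumptions for illustration, not a defined assessment method.
import random
from collections import defaultdict

def mastery_by_micro_standard(records):
    """Average the scores recorded for each micro-standard, regardless of
    which app or learning experience produced them."""
    scores = defaultdict(list)
    for r in records:  # e.g. {"micro_standard": "CCSS.MATH.4.NF.1a", "score": 0.9}
        scores[r["micro_standard"]].append(r["score"])
    return {std: sum(v) / len(v) for std, v in scores.items()}

def lightweight_summative_check(records, sample_size=5, threshold=0.8, seed=0):
    """Rather than re-testing everything, sample a few micro-standards from
    the running gradebook and report the share at or above a mastery bar."""
    mastery = mastery_by_micro_standard(records)
    if not mastery:
        return 0.0
    random.seed(seed)
    sampled = random.sample(sorted(mastery), min(sample_size, len(mastery)))
    return sum(mastery[s] >= threshold for s in sampled) / len(sampled)
```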

For the last 15 years we’ve tried to answer these with inexpensive bubble tests.  With far more data, states and districts will have several opportunities this decade to update policies in each of these areas.  Advances in assessment, or more broadly in performance data, will fundamentally change how schools are organized and operated.  And that will result in a dramatic increase in the percentage of students ready for college and careers.

3 COMMENTS

  1. Very comprehensive post. Don’t forget the Mozilla Badges as part of the artifacts in the portfolio.

    I am still wary of adding more online testing for so many things. I work in Oregon, where online statewide testing has led students to call computer labs “testing labs” because the computers see very little instructional use.

    One-to-one programs will fix this, but Oregon schools cannot afford them in the foreseeable future…even in 10 years…unless basic funding increases.

    I would rather focus on technology for the “mass customization” of content and keep paper bubble tests if online testing brings the same unintended consequences we have seen in Oregon.
