Online assessments are at the heart of the promise of digital learning. They provide real-time diagnostics that identify learning levels and gaps, and, often embedded in learning experiences like games, they deliver instant feedback in reading, writing, math, and other subjects. Online assessment will also improve test security (no more eraser-gate) and will power the future of customized learning, the best chance we have to dramatically boost achievement levels.
Most states have started using some online assessments as part of their state testing programs, and they will finish the shift to fully online assessments in 2015. The two Race to the Top-funded state testing consortia, the SMARTER Balanced Assessment Consortium (SBAC) and the Partnership for Assessment of Readiness for College and Careers (PARCC), have both committed to using online assessment to make the next generation of state tests less expensive to administer and faster to turn around. Most importantly, advances in intelligent scoring have made it possible to include a significant amount of writing on these new tests, along with constructed-response items and innovative performance tasks. The new tests will reflect the deeper learning aspirations of the Common Core State Standards.
In February, leading testing companies conducted a private demonstration showing very high levels of accuracy in scoring thousands of student essays (results will be made public on April 16). The Hewlett Foundation-funded Automated Student Assessment Prize (ASAP) has over 100 entrants, many of whom can score essays with levels of agreement that rival expert graders. The $100,000 prize purse for the winners of the open competition will be awarded on May 9 in Washington, DC.
The consortia have to this point remained silent about what kinds of devices their new tests will run on. On one hand, that implies flexibility, but it has also created some uncertainty: many states and districts are anxious for definitive answers so they can plan accordingly as they develop high-access environments.
The big question is whether the consortia will allow testing on tablets. Many schools are using tablets to create high-access environments, and for them, using tablets in state testing would seem like the natural solution. However, tablets as a testing option introduce many tough pedagogical, technological, and psychometric (and eventually legal) questions. Consortia staff are aware of these issues and have plans to address them through research and field testing.
For me, writing is the big issue. I want to see American kids writing a lot, both in classrooms and on state tests. I still think of tablets as consumption devices and don't recommend them for high-volume, high-value production. So how would they work on state tests that require several essays? Where tablets are used, I suspect tablet-centric districts will need several accommodations, such as wireless keyboards for tablets or access to laptops or computer labs for the writing component of the test. By 2015, touch-screen writing may be good enough, and widely used enough, that this will be no big deal.
Comparability will be a challenge in a number of ways. What if some students have big screens and some have small screens? The ability to read a document and write a response side by side on one screen is highly desirable but hard to achieve on a tablet. What if some touch-enabled items were released and available to some kids but not others? If tablets are to be considered, the consortia will need to field-test all of these questions. It's probably most important that students be tested in the same or a similar environment to their regular learning environment.
Over the next five to ten years we need a better definition of comparability than "the same instrument given on the same day under the same conditions": a definition that allows multiple instruments and comparisons of big data sets to inform the variety of jobs we ask of tests. As I've said before, assessment will advance when it moves into the background.
But in the short run, the next three years, states and districts need to get ready for online assessment. One way or another, I think the consortia will soon provide some certainty about acceptable devices, and there will be no excuse for not being prepared to administer online assessments in 2015.
In January, PARCC and SBAC awarded a contract to Pearson to develop and administer, with the assistance of the State Educational Technology Directors Association (SETDA), a Technology Readiness Tool. SETDA director Doug Levin (who is all over this issue) recently provided a great overview of online assessment and the readiness survey. Bryan Bleil is leading the effort for Pearson. Here’s the Pearson site with everything you need to know about the survey and what you can do to get ready: www.PearsonAssessments.com/NextGenRoadmap.
While it will require some upgrades and additions, most districts already have enough computers to administer online assessments in a couple of shifts (i.e., at least one computer for every three students), and the 2015 deadline also provides a perfect opportunity to make the shift to digital instructional materials. Like Florida, states should set a date no later than 2015 and begin planning the shift to online assessment and personal digital learning.
For more, see:
- How Intelligent Scoring Will Help Create an Intelligent System
- 5 Strategies to Deliver Edtech Access to Every Student
- Letter from New Hampshire chief explaining the Tech Readiness Tool to principals