By Karen Cator, Director of the Office of Educational Technology, U.S. Department of Education
Have you noticed lately that MOOCs are all over the news? It’s hard to imagine that just a year ago, most people had never heard of Massive Open Online Courses: courses, rapidly growing in number, that hundreds of thousands of people all over the world take online free of charge. With this kind of opportunity comes the responsibility to ensure that these and other learning resources are quickly and continuously improved based on the best data available. Luckily, more and better data is emerging as digital learning becomes commonplace.
Change happens big in technology, and it happens fast. And when public money is being spent and students’ futures are at stake, it is crucial that change also happens smart. Our new report, Expanding Evidence Approaches for Learning in a Digital World (http://www.ed.gov/edblogs/technology/evidence-framework/), calls for smart change by presenting educators, policymakers, and funders with an expanded view of evidence approaches and sources of data that can support decision-making about learning resources.
The report discusses the promise of sophisticated digital learning systems for collecting and analyzing very large amounts of fine-grained data (“big data”) as users interact with them. It proposes that developers and researchers can use this data to improve learning systems and to discover more about how people learn. It also discusses the potential for more sophisticated ways of measuring what learners know and for adaptive systems that personalize learners’ experiences.
The report describes an iterative R&D process with rapid design cycles and built-in feedback loops, one familiar in industry but less so in education (though the report provides numerous examples of its application in education). An iterative R&D process enables early-stage innovations to be rapidly deployed, widely adopted, and, through continuous improvement, refined and enhanced over time. This means that data collection and analysis can occur continuously and that users are integral to the improvement process.
The report encourages learning technology developers, researchers, and educators to collaborate with and learn from one another as a means of accelerating progress and ensuring innovation in education.
In the spirit of an iterative development process, we are posting this report for public comment. Does the report resonate with your view of the emerging digital learning landscape and the data it generates? Do you have examples of evidence-gathering methods that use emerging data? Are the recommendations the right ones for enabling progress? Do you have other thoughts and ideas on the topic of data, evidence, and digital learning? We would like to hear from you!
Thanks to our Technical Working Group and Expert Advisors
This report was developed collaboratively, in partnership with a Technical Working Group of learning technologies experts. We wish to thank Eva L. Baker (University of California, Los Angeles), Allan Collins (Northwestern University), Chris Dede (Harvard University), Adam Gamoran (University of Wisconsin), Kenji Hakuta (Stanford University), Anthony E. Kelly (George Mason University), Kenneth R. Koedinger (Carnegie Mellon University), David Niemi (Kaplan, Inc.), James Pellegrino (University of Illinois, Chicago), William R. Penuel (University of Colorado, Boulder), Zoran Popović (University of Washington),
Steve Ritter (Carnegie Learning), Russell W. Rumberger (University of California, Santa Barbara), Russell Shilling (Department of Defense, United States Navy), Marshall S. Smith (The Carnegie Foundation for the Advancement of Teaching) and Phoenix Wang (William Penn Foundation).