Expanding Evidence Approaches for Learning in a Digital World is a new report from the EdTech Office at the Department of Education. “The report discusses the promise of sophisticated digital learning systems for collecting and analyzing very large amounts of fine-grained data (“big data”) as users interact with the systems,” said Karen Cator, the outgoing EdTech Director. “It proposes that this data can be used by developers and researchers to improve these learning systems and strive to discover more about how people learn. It discusses the potential of developing more sophisticated ways of measuring what learners know and adaptive systems that can personalize learners’ experiences.”
Together with the National EdTech Plan, the report bookends Cator’s energetic tenure. A former Apple executive, Cator “would like to manage a smooth transition to the next director” and will probably return to the private sector in the first quarter.
The report suggests that “evolved thinking” about edtech and evidence requires:
- Digital learning innovations that can be developed and implemented quickly so every school has the chance to adopt them
- Continuous improvement processes to improve, adapt, and enhance these innovations as experience is gained in using them
- Use of the vast amounts of data that sophisticated digital learning systems collect in real time to ask and answer questions about how individual learners learn so the needs of every student can be met; and
- Expanded approaches to evidence gathering for greater confidence that investments in cost-effective and cost-saving technology-based interventions are wise, producing the outcomes sought.
The 100-page report was compiled by a team led by Barbara Means of SRI and was based on advice from a long list of academic advisors. The giant lit review is well organized but not as forward-leaning as it would have been had the team included some technologists and entrepreneurs.
The report is organized around five evidence-gathering approaches and opportunities.
1. Promote deeper learning. Design-based implementation research (DBIR) is an emerging research approach better suited than traditional methods to investigating how digital learning is used in different contexts and how that affects outcomes. “A hoped-for benefit of DBIR collaborations is that education practitioners will think about their activities as cycles of implementation, data collection, reflection, and refinement and constantly seek data and information to refine their practice.” (For more on this topic, see the Getting Smart report, How Digital Learning Contributes to Deeper Learning.)
2. Promote personalized learning. Adaptive systems that “combine insights from learning theory…with large sets of detailed learning data…make the long-sought goal of differentiating instruction for every learner much more attainable.”
The report offers a useful definition of personalization: “instruction that is paced to learning needs, tailored to learning preferences, and tailored to the specific interests of different learners. In an environment that is fully personalized, the learning objectives and content as well as the method and pace may all vary. Thus, personalization encompasses differentiation and individualization.”
“Capabilities now available in newer and more sophisticated digital learning systems include:
- Dynamically updated fine-grained modeling of learner knowledge that can be compared to a knowledge model of the concepts to be learned
- Micro-level tagging of instructional content, along with micro-level capture of learner actions within adaptive systems; and
- Adaptations based on students’ emotional states and levels of motivation.”
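One standard technique behind the fine-grained learner modeling the report describes is Bayesian Knowledge Tracing, which updates the estimated probability that a student has mastered a skill after each response. A minimal sketch, with illustrative parameter values (the slip, guess, and learn rates here are invented for demonstration, not drawn from the report or any real system):

```python
# Bayesian Knowledge Tracing: update the probability a learner knows a
# skill after observing one response. Parameter values are illustrative.

def bkt_update(p_known, correct, p_slip=0.1, p_guess=0.2, p_learn=0.15):
    """Return the updated mastery probability after one observed response."""
    if correct:
        # A correct answer is evidence of knowing (unless the student slipped)
        evidence = p_known * (1 - p_slip)
        total = evidence + (1 - p_known) * p_guess
    else:
        # An incorrect answer is evidence of not knowing (unless it was a slip)
        evidence = p_known * p_slip
        total = evidence + (1 - p_known) * (1 - p_guess)
    p_given_obs = evidence / total
    # Account for learning that happens between practice opportunities.
    return p_given_obs + (1 - p_given_obs) * p_learn

# Trace a learner's estimated mastery across a short practice sequence.
p = 0.3
for outcome in [True, True, False, True]:
    p = bkt_update(p, outcome)
```

An adaptive system would compare each skill’s running estimate against a mastery threshold to decide what content to serve next.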
The report also points out that “relatively simple technology supports can also be used to help classroom teachers dynamically adapt their instructional methods” and highlights the Ninth Grade Tracker and the College Readiness Tracker created by New Visions.
3. Promote responsive supports. This section argues that schools should 1) be more responsive to students’ needs and interests, 2) appreciate that learning happens in a wide range of settings, and 3) attend to the multiple aspects of student well-being.
The report suggests combining data from a variety of sources to “develop early warning systems for predicting student risks” and highlights ClassDojo, “a real-time behavior management tool…which creates reports for each individual student…[which] can be emailed to students and used as the basis for conversations with students and their parents to explore how behavior can be improved.”
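Early warning systems of the kind the report describes typically combine a few leading indicators, often attendance, behavior, and course performance. A toy sketch of the idea; the thresholds and field names below are assumptions for illustration, not values from the report or any district’s model:

```python
# Toy early-warning flag combining common "ABC" indicators: attendance,
# behavior, course performance. Thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    attendance_rate: float   # 0.0 - 1.0, share of days attended
    behavior_incidents: int  # incidents this quarter
    gpa: float               # 0.0 - 4.0

def risk_flags(s: StudentRecord) -> list:
    """Return the list of indicators on which this student is flagged."""
    flags = []
    if s.attendance_rate < 0.90:
        flags.append("attendance")
    if s.behavior_incidents >= 3:
        flags.append("behavior")
    if s.gpa < 2.0:
        flags.append("course performance")
    return flags

student = StudentRecord(attendance_rate=0.85, behavior_incidents=1, gpa=1.8)
print(risk_flags(student))  # flags attendance and course performance
```

A production system would replace these hand-set thresholds with a model trained on historical outcome data.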
4. Improve content and assessment. Evidence-centered design and content-embedded assessment make it easier to measure important features reliably and efficiently.
The report is bullish on embedding assessments in learning experiences as well as “the promise of automating the scoring of complex performances, addressing issues of cost and reliability at the same time. A case in point is the automated scoring of student essays” (a project I co-directed).
Emerging strategies for gathering evidence include “data mining and learning analytics techniques to analyze learner log files from digital learning systems.”
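The simplest form of the log-file mining the report points to is aggregating raw event records into per-learner measures such as attempts, error rate, and time on task. A minimal sketch; the event format below is a hypothetical example, not any real system’s schema:

```python
# Minimal learning-analytics sketch: aggregate per-learner attempts,
# errors, and time on task from raw log events. Event schema is invented.

from collections import defaultdict

events = [
    {"learner": "s1", "item": "q1", "correct": True,  "seconds": 42},
    {"learner": "s1", "item": "q2", "correct": False, "seconds": 95},
    {"learner": "s2", "item": "q1", "correct": True,  "seconds": 30},
]

stats = defaultdict(lambda: {"attempts": 0, "errors": 0, "seconds": 0})
for e in events:
    s = stats[e["learner"]]
    s["attempts"] += 1
    s["errors"] += 0 if e["correct"] else 1
    s["seconds"] += e["seconds"]

# Report error rate and total time on task per learner.
for learner, s in sorted(stats.items()):
    print(learner, s["errors"] / s["attempts"], s["seconds"])
```

From aggregates like these, researchers can ask the report’s question of how individual learners learn, for example by correlating time on task with error rates across content types.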
The Department has prepared a brief on grit, tenacity, and perseverance (to be released this month) that argues these dispositions “are teachable and made up of three components: academic mindsets (cognitive framings that support perseverance), effortful self-control, and strategies and tactics (such as adaptation).”
5. Find resources and make choices. It’s becoming easier to produce learning content, and common standards are making it easier to share. The report cited examples of pulling together learning resources from multiple sources, like LearnZillion and Gooru Learning, but left out the biggest one–Edmodo.
The report reviews strategies for collecting evidence about learning experiences including aggregating user actions and reviews, user panels, expert reviews, and test beds.
Recommendations. The paper makes 14 proposals that are rational but not very forward-leaning; many are university-based and require more government funding.
My favorite proposes that “interdisciplinary teams of experts in educational data mining, learning analytics, and visual analytics should collaborate to design and implement research and evidence projects.” There is great promise for interdisciplinary approaches in both university and enterprise settings.
The data recommendation suggests that “Stakeholders who collect and maintain student data should participate in the implementation of technical processes and legal trust agreements that permit the sharing of data electronically and securely between institutions.” We recently pushed beyond this generalization with two specific recommendations in a DLN SmartSeries paper, Data Backpack: Portable Records and Learner Profiles:
- A thick gradebook of data should follow students from grade to grade and school to school to allow teachers to personalize learning from day one, and
- Parents and teachers should be able to manage a comprehensive profile to customize a learning pathway.
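The “data backpack” is, at bottom, a portable data structure that travels with the student. A hypothetical sketch of what such a record might contain; every field name here is invented for illustration, and a real implementation would follow an agreed interoperability standard and the trust agreements the report calls for:

```python
# Hypothetical portable learner record ("data backpack"). Field names
# are invented for illustration, not from any standard or the report.

import json
from dataclasses import dataclass, field, asdict

@dataclass
class LearnerProfile:
    student_id: str
    standards_mastered: dict = field(default_factory=dict)  # standard -> score
    interests: list = field(default_factory=list)
    accommodations: list = field(default_factory=list)

profile = LearnerProfile(
    student_id="abc-123",
    standards_mastered={"CCSS.MATH.6.RP.A.1": 0.92},
    interests=["robotics"],
)

# Serialize for transfer between schools; a real system would add
# encryption and consent controls on top of this.
payload = json.dumps(asdict(profile))
```

The point of the gradebook recommendation above is that a record like this arrives with the student on day one, rather than being rebuilt from scratch each fall.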
The paper recommends increased R&D funding for promoting noncognitive (or what David Conley calls metacognitive) skills, including communication, collaboration, leadership, persistence, and self-regulation. We agree on the potential for Innovations in Social Emotional Learning and believe that Digital Learning Contributes to Deeper Learning.
With the shift to growth models, the proliferation of individual learning pathways, and competency-based systems, one of the emerging evidence challenges of this decade is comparing academic progress across disparate data sets. Common standards help, but American education is trapped in a dated view of comparability based on the same instrument (usually an end-of-year multiple choice test) given under the same conditions during the same window. We need to develop new methods to compare thousands of observations in super-gradebooks of standards-aligned data. The fact that comparability is not addressed is a major shortcoming of this paper.
Topics mentioned but given short shrift: micro-tagging (key to comparability), motivational profiles (key to persistence), and iterative development (key to quality and cost-effectiveness).
This evidence framework is a competent contribution to the sector dialog. The objectives are sound and the examples are useful. It covers most of what it should. It signals that the next few years will be full of developments that will benefit millions of American students.