Creating a Data Driven Classroom

Data-driven instruction can often feel like a classic education buzzword. We know data is important and increasingly available, but how do we effectively integrate it into daily practice? How do we go beyond simply acknowledging its importance to truly using it to create a more individualized and student-centered learning environment?

We believe in the Student Data Backpack and Expanded Learning Profile for enhancing the potential of student data to personalize learning and provide a path to protecting student information. Student data is not just a huge problem to solve, but a huge opportunity to seize.

This blog first appeared on the blog of Getting Smart Advocacy Partner DreamBox Learning. Authored by DreamBox Superintendent in Residence Dr. Gregory Firn, this is the first in a three-part series focusing on revolutionizing teaching and learning through the use of data-driven instruction.


Dr. Gregory Firn

As an educator, I’m both sensitive and guarded in any discussions about data. The use and abuse of data has created a sense of cynicism and skepticism in educators that, for the most part, leaves many with a completely tainted point of view. Simply put, data is seen as a four-letter word. However, a data-driven culture that nurtures and continuously examines learning processes and outcomes—rather than one that’s used as a hammer or agent to malign, degrade, or blame students or teachers—is necessary for personalized learning and to revolutionize instruction in the classroom.

As a school leader, I worked with a team of educators at the district level to strategize and ultimately help build a new culture surrounding the way we talked about and used data-driven instruction in the classroom. We remained focused in our efforts to mitigate objections, obstacles, and an overwhelming reluctance to use data to inform instructional decisions and to track student progress toward meeting or exceeding learning goals. We devised a solid plan and experienced initial growth and improvement, but then plateaued, with results flattening or, in some cases, declining. I watched staff struggle to make sense of student performance data, whether End of Grade and End of Course assessment results or “Value-Added” predictive data. I experienced firsthand the frustration of classroom teachers who sincerely and with great effort examined benchmark and intermittent assessment data with the best of intentions, only to become discouraged.

Here, I’ll discuss the three underlying trends that emerged, which provide context to our lack of sustained improvement and explain the frustration that teachers, school leaders, and central office staff experienced.

Trend One: Trailing—Not Leading—Indicators of Learning

Like most assessment systems in place today, ours was built on summative data sets and analytics, and on providing increasingly easy access to data for teachers and school administrators. Our staff found, however, that the presentation of data and the reporting protocols were cumbersome. Though the amount and breadth of data was vast, the insights our teachers needed and wanted in order to adjust instruction and address failures to learn were nonexistent. Powerful reporting protocols and dashboards that show evidence of learning, or its absence, are helpful and needed; however, they fall short of providing critical insight during the learning process. Assessment systems are based almost exclusively on what a learner has already learned (trailing, after the fact) and not on how the learner is learning (leading, during the process).

I soon realized that trailing indicator data was neither designed for nor capable of doing what we hoped to use it for. We focused significantly on developing our staff’s understanding of leading indicators, ones that teachers could observe, monitor, and influence throughout the learning process. The challenge that eventually frustrated our staff was the inability of our assessment system to provide the data they needed. Additionally, certain realities emerged that cast further doubt on the credibility of the data. For example, the data neither represented prior learning experiences or insights about the learner, nor indicated how the learner constructed meaning or conceptualized understanding. Although technology has significantly improved the capacity to collect, store, and analyze large and discrete data sets, its usefulness is limited almost entirely by the decision of what data are selected for input.

Trend Two: Intensity of Time

Preparing, analyzing, and interpreting data is extremely time intensive. Even more so is the time necessary to gain insights leading to data-driven instructional decisions.

I made the mistake of overlooking what may seem obvious: the comfort, confidence, and competence of staff in working with and using data. Instead, I found a significant gap in data and assessment understanding. That capacity, however, is fundamental to the quality of discussions and decisions about instructional practices, policies, and improvement initiatives.

Assessment literacy became an intensive focus, and we committed resources, including vast amounts of time, to raising awareness of assessments and their corresponding data. We modified instructional days and examined other time factors to better understand the data. The challenge was finding the time necessary to glean insights, which made coherent, actionable strategies problematic. Initially, staff became proficient at identifying “failed” learning in the data and determining steps to respond, react, remedy, or reteach a particular set of content. Yet in most cases the data, although reported at the individual learner level, was not actionable by the teacher because it could not reveal just where in the learning process the student failed to create, evaluate, analyze, actualize, apply, understand, or connect. A shift from reacting to this kind of “failed” learning to preventing it was needed, and it required a different mindset as well as fewer time constraints in the classroom.

Trend Three: Technology

Closely related to the limitations of the assessment system to collect and monitor leading indicators and the intensity of time was the access, reliability, dependability, and utility of technology. I discovered several members of staff who were inexperienced with regard to digital tools, devices, and programs. Therefore, digital literacy for the purpose of building instructional competence and confidence became vital.

I’ve long been convinced that technology holds two capabilities with respect to instructional decision-making. The first is the immediacy of feedback on the effect of an instructional strategy or lesson on learning. The closer we get to real-time feedback, the closer we get to the authentic ability to base instructional decisions on insight, actionable intelligence, and individual information. The second capability is the empowerment of both the teacher and the learner through relevant feedback during the process of learning, not just at the end. The immediacy of feedback, along with its form, quality, and context, determines its effect on the learner as well as the teacher. The ability to monitor the construction of learning and the formation of conceptual awareness and understanding, and to watch a student make meaning while learning, is a powerful notion. Moreover, imagine the insight for teachers and their ability to make instructional decisions in real time to authentically differentiate and personalize instruction and learning!

My experience as a school leader in using data to drive transformative change created powerful learning for me as well as for those I was privileged to lead. That learning resulted in incredible outcomes. But as good as our results were, we failed to achieve scale, due in part to the three trends described above. Now more than ever, I’m convinced that what were once constraints will become possibilities: the ability to leverage data in the form of leading indicators of learning, to leverage time differently, and to leverage technology to inform, influence, and impact teaching and learning in ways once thought idealistic or impractical.

Superintendent in Residence at DreamBox Learning. Follow Dr. Firn (@BestofClass) and DreamBox Learning (@DreamBox_Learn) on Twitter.

