Educational fMRIs: Dynamic Pedagogy and Pedagogical Analysis in the Multimodal AI Era

My deepest convictions about educational assessment were shaped outside the classroom with my late wife of seven decades, Dr. Susan Gordon, a pediatrician. I watched her treat a diagnosis of medical status as a starting point for understanding, never an endpoint of judgment or mere classification. Closing the chart, she wanted to know about a child's wider functional ecology: sleep, home stressors, supports; the relations among the data points, and between those data points and what was happening in the child's life. She taught me that consequential assessments do not arrive in sterile score reports. They occur when a caring human truly seeks to understand the relations among these indicators and the child's functioning.

Working alongside Dr. Else Haeussermann in the 1950s taught a parallel lesson: Else evaluated children's status and functioning not to categorize them (e.g., as "uneducable") but to discover how they functioned, and under what conditions their functioning improved. Both clinical mentors modeled a profound truth: a meaningful diagnosis carries the obligation to understand the person-in-context and to act on that understanding.

American education has spent a century doing the opposite, treating the "symptom" (a standardized test score) as the whole story. To build an effective, multimodal AI-powered education system, we must fundamentally repurpose our measurement systems, transitioning to Assessment in the Service of Learning, operationalized through Dynamic Pedagogy, and embracing human-centered Pedagogical Analysis.

Assessment in the Service of Learning: From Thermometers to Thermostats

For generations, prevailing assessments have tended to audit achievement rather than to cultivate it. Resembling backward-looking thermometers that measure temperature when it’s too late to adjust, traditional standardized tests are static snapshots, lagging indicators that do little to illuminate how students actually learn.

To drive continuous growth, we need forward-looking thermostats. Assessment in the Service of Learning acts as this embedded thermostat: constantly sensing, providing feedback, and adjusting conditions to keep learning on track.

This requires viewing teaching, learning, and assessment as inseparable. Assessment cannot remain an external audit that interrupts teaching; it must become a transaction woven into the fabric of pedagogy, conceived as assessment, teaching, and learning together.

Dynamic Pedagogy: Education’s “Functional Medicine”

If Assessment in the Service of Learning provides the feedback loop, Dynamic Pedagogy describes the instructional approach, adapting the learning environment to the insights revealed. 

We see this in functional medicine, where disease is treated not as an endpoint but as a process. Two patients might share a "diabetes" diagnosis, but the processes that led them there (and their necessary treatments) can be radically different.

Education remains stuck in an old medical model, letting diagnoses like "below grade level" become destiny. Dynamic Pedagogy acts as a functional diagnostic, asking how a child functions and under what conditions they might thrive. Precision education requires treating human variance as an asset, not statistical "noise" to be minimized. Embracing it means designing pedagogies that adapt to the learner, abandoning the myth of fixed potential because, as Dr. Pamela Cantor and Kate Felsen argue, context is causal. A test score reflects performance under momentary conditions, not a child's ceiling. Change the context, and performance changes. Equity means deeply understanding each learner's unique profile, not standardizing the yardstick.

Pedagogical Analysis and Analytics

To scale this in the multimodal AI era, we need Pedagogical Analysis (humanistic interpretation of context) and Analytics (quantitative detection of patterns). Together, they shift focus from static outcomes to the processes of progression.

A traditional test resembles a standard MRI: a still photograph of one frozen moment. Pedagogical Analysis strives to be an educational fMRI: a moving picture of learning. Instead of asking whether an item is right, we ask: What strategies were attempted? Where did confusion arise? We seek to capture the fluidity, dynamism, and processes of being and becoming in the learner.

Historically, capturing this moving picture was prohibitively difficult. Today, multimodal AI-powered platforms have the potential to capture real-time problem-solving and engagement metrics, placing in our hands the tools to render industrial-era summative tests obsolete.

But here we must issue a caution: more data does not automatically equal more understanding. We must avoid the modern physician's trap of letting visits devolve into data transactions. Analytics devoid of relational analysis are pedagogically impoverished. AI must serve the teacher-student partnership; transforming learning requires the synergy of algorithmic detection and human interpretation.

Conclusion

We have the science, technology, and moral imperative to redesign assessment. Returning to that kitchen table (closing the chart and looking at the child), the ultimate goal becomes clear: the purpose of assessment, dynamic pedagogy, and pedagogical analysis is not measurement elegance, but human flourishing. Let us build a system worthy of human potential.

* This piece was written by Edmund W. Gordon with support from Eric M. Tucker. It’s written from Dr. Gordon’s perspective. Thanks to Eleanor Armour-Thomas for her groundbreaking work on Dynamic Pedagogy.

Edmund W. Gordon

Edmund W. Gordon is the John M. Musser Professor of Psychology, Emeritus at Yale University; Richard March Hoe Professor, Emeritus of Psychology and Education, at Teachers College, Columbia University; and Honorary President of the American Educational Research Association. He has served in leadership roles with the National Academy of Education, federal government, College Board, and ETS. He co-founded Head Start during the Johnson Administration.

Eric Tucker

Eric Tucker is President of the Study Group. He has served as CEO of Equity by Design, Cofounder and Superintendent of Brooklyn Laboratory Charter Schools (including the Edmund W. Gordon Brooklyn LAB School), Cofounder of Educating All Learners Alliance, Executive Director of InnovateEDU, Director at the Federal Reserve Bank of New York, and Chief Academic Officer and Cofounder of the National Association for Urban Debate Leagues. He has taught in Providence and Chicago.
