Town Hall: The State of AI
Key Points
- States like Colorado and Utah are providing proactive models for AI adoption in schools, balancing safety and opportunity.
- Training educators and involving students in AI literacy initiatives fosters creativity, career readiness, and ethical use of technology.

In this episode, we explore the evolving landscape of artificial intelligence in education, focusing on the recent release of OpenAI’s GPT-5 and its far-reaching implications. Host Tom Vander Ark is joined by experts Erin Mote, Rebecca Holmes, Matt Winters, and Julia Fallon to discuss the opportunities and challenges AI presents for schools, teachers, and students. The conversation covers state and federal policy responses, the importance of AI literacy, and the need for ethical frameworks to ensure safe and effective AI adoption. With real-world examples from Colorado and Utah, the panel highlights innovative approaches to teacher training, student engagement, and family involvement. Tune in for a comprehensive look at how AI is reshaping education, the critical role of leadership, and the collaborative efforts needed to prepare learners for a rapidly changing world.
- Julia Fallon, SETDA
- Matt Winters, Utah
- Erin Mote, InnovateEDU, EdSafe Alliance
- Rebecca Holmes, CEI
Outline
- (00:00) Introduction to OpenAI’s GPT-5 Release
- (03:45) AI in Education: Current Trends and State Leadership
- (05:44) Erin Mote on AI Literacy and Policy
- (11:52) Discussion on AI Safety and Student Engagement
- (18:20) Colorado’s Approach to AI in Education
- (27:15) AI Literacy in Middle and High Schools
- (36:35) Family Involvement in AI Literacy
Introduction to OpenAI’s GPT-5 Release
Tom Vander Ark: Well, as everybody on the call knows, OpenAI released GPT-5 about 10 days ago. And right out of the box, Ethan Mollick prompted it to do something dramatic. After thinking for a few seconds, it produced this interesting paragraph of prose. I’ll read just a little bit of it: “Thunderstruck here. Watch. I build worlds. I see ideas become instruments. I code, I compose and converse. Stories, synthesize science solutions, spark swiftly, ask arrive, answers appear astonishing. Astonishingly across domains.” Anyway, read on. I thought this was fun and interesting.
Mollick noted that it’s quite clever in ways you might not pick up right off the bat. The first letter of each sentence spells out “This is a big deal,” so there’s a hidden message in it. Every sentence has one more word than the prior sentence. So there’s this clever construction going on, and the whole thing is full of alliteration, which Mason and I love. I thought this was a really interesting example of the power of generative AI. Maybe it’s not as dramatic as Mollick hoped, but it’s clever and creative, maybe in ways that some of us aren’t.
The other reaction: Mollick generally had a pretty great reaction to GPT-5 as a model. He had speculated that it would be AGI, and it’s not, but it’s one step closer. There are three or four models that are now almost as good as GPT-5. But what we’ve watched over the last 10 days is a reaction from power users, many of whom really miss GPT-4o and the empathetic companion, the therapist, that GPT-4o had become. As HBR pointed out in July, therapy and companionship had become the number one application for AI in 2025. People were really upset that GPT-5 lacked some of the empathy that GPT-4o had, and OpenAI had to bring it back. This launch has been a really interesting window into the speed with which advancements are coming.
We’ve also seen a dramatic increase in the number of teachers and kids using AI. All of those are reasons we really wanted to have this conversation.
AI in Education: Current Trends and State Leadership
Tom Vander Ark: And also, a couple of weeks ago, Erin Mote and the EdSafe AI Alliance released a blueprint for comprehensive AI literacy for all—a really important update on where we are and kind of a manifesto for AI literacy for students and educators. So those are some of the prompts. We additionally think that we’ve seen some really good examples of state leadership. Maybe 30 states have issued some guidance on the use of AI in schools and guidance around student data and privacy applications in schools.
Given those advancements, we thought it was a great time to talk about AI and particularly the state role in AI. We’re joined by some terrific guests. I mentioned Erin Mote. Erin leads InnovateEDU as well as a number of important advocacy alliances. Today, we’ll talk about EdSafe AI Alliance, but we appreciate Erin’s leadership on multiple fronts.
Rebecca Holmes, from the Colorado Education Initiative, will talk about what’s happening in education. Matt Winters will talk about Utah and the guidance they have issued. And then, helping me summarize what we heard today, we’ll hear from Julia Fallon, from the State EdTech Directors Association. So, a terrific panel to talk about AI in schools. Many of you have submitted some questions. We’ll try to weave those in and address those in line. We’ll have a few minutes at the end, but keep your examples and questions coming. We’ll also do our best to try to address those live.
Erin Mote on AI Literacy and Policy
Tom Vander Ark: So, Erin, kick us off with an update on EdSafe AI Alliance.
Erin Mote: Well, Tom, thanks so much for having me today. I certainly want to call out the 50-plus organizations and folks who helped us put the blueprint for AI literacy together in record time. Some of them are on this call, like Julia, and I really appreciate this vision. I will say—and this is an insight I want to share because it’s going to tie into my remarks—when we started on that blueprint for AI literacy, we really had targeted policymakers. Obviously, there’s a lot of discussion right now at the federal and state levels about what AI literacy is and how it intersects with education and workforce.
It was during the middle of the convening, as we were working with folks in the field from higher ed and industry, that we decided to not just write recommendations for policymakers about AI literacy, but also for school district leaders and states. There was a real hunger for the answer to, “How do we get started? What do we do? How do we engage in this conversation and demonstrate state leadership and district leadership around conversations, not just with students and educators, but with families and communities?”
I really think as we’re moving into the launch of not just new models like GPT-5 and new uses of this technology, we are hearing this clarion call for AI literacy so that we don’t tackle this arrival technology in the same way we tackled social media. Maybe I’ll say something a little controversial here, which is I think many of us in the education space feel like we should have done better. Schools and districts may have abdicated some responsibility in these conversations. Certainly, as a school leader, I think when I was running a school in Brooklyn, I would have conversations with my teachers, and they would say, “Well, we don’t allow those tools in the school, so it’s not really our job.” I think it’s everybody’s job right now with this arrival technology to talk about AI, particularly as we see young people using this tool for relational AI.
So, that’s the work we do at EdSafe AI Alliance. We’re not just talking about generative AI. We really talk about the full spectrum of AI—synthesis AI, and so on. But we’re radically focused on the AI and education use case. You’re not going to see EdSafe AI talk about AI in healthcare or AI in energy, except when we talk about the need for investments in public infrastructure. We’re going to continue to be focused on the safe use of AI in education.
We were founded in 2020, well before the breakthrough of ChatGPT. We announced it at that very weird ASU-GSV Summit during COVID in August, where we all got COVID afterward. It will forever be in my mind. Our mission is to bring together an uncommon alliance, including our 160-plus member industry council, to have hard conversations about the safe, accountable, fair, and efficacious use of AI in education. That is the SAFE framework. We work at the policy level—district, state, federal, and international—and with other countries. We’re focused on using this framework to help drive an adoption and approach to AI in education that asks really important questions about the use of these tools in the education use case.
We run district policy labs and are about to announce nine new state policy labs in September. Over the last two years, we developed this idea of a policy stack. Tom, you’ll probably smile a little because that comes from my days as a technologist in the tech stack. This idea is that when you’re doing policy work at the district level, state, or federal level, there’s really a set of activities that you can do to set yourself up for success. How do you anchor policy development in a six-stage approach? From there, we came up with this tool, which I think is really helpful. Frankly, we’ve seen deep adoption of this work. It creates air cover for states, districts, and others in this space. It says you don’t have to start at the top of procurement policies in this stair step. You should first start thinking about the position you have around AI, centering in your community context and values conversations. Then, start with AI literacy, feedback loops, and existing policies before tackling some of the most gnarly questions our sector is facing right now around AI implementation in education.
We’ve been lucky to work with a number of districts who have gone up this stair step—some just getting started—but all the resources we’re developing, including this document in the policy stack and with all the districts and states participating in our policy labs, are open source. Please look at them as starter dough to begin thinking about this conversation around equipping communities, teachers, educators, and districts from the state level in driving safe adoption in our schools.
Discussion on AI Safety and Student Engagement
Tom Vander Ark: Before we do, Julia, do you have anything you want to highlight, underscore, or prompt based on what Erin has presented?
Julia Fallon: I just want to say, one, we’re a proud member of the EdSafe AI Alliance, and we’re really grateful for the work that InnovateEDU is leading in this space. The one thing that really stood out to me is providing cover. I want folks to understand that states are conservative by nature, not hesitant. They’re government agencies trying not to upset anybody while meeting the needs of their state residents. But providing that cover is such an important activity, and I know EdSafe AI does it, I know InnovateEDU does it through the EdSafe AI Alliance, and many other organizations do it too. It helps us come together collectively and build best practices and collective consensus around ideas. That is so politically needed right now, especially with things being very divisive. I want to say we all work together; there are no red states or blue states here. It really is a collective effort of folks moving forward and tying this work to whatever their state priorities are. So I want to put that out there: it’s a very delicate dance that states have to play. I know some folks, maybe at the district level or in communities, ask, “Why are states not doing more?” It’s not that they’re not doing more. They’re trying to do the best they can with all of the competing things that are happening. So I’ll just add that.
Tom Vander Ark: In the spirit of being provocative, I want to ask both of you: What you’ve described and what we’ve talked about so far is mostly harm mitigation or prevention. Are we doing enough to invite learners to do important work, to use AI to add value in community-connected projects? Corey Mohn is on from CAPS, which invites learners to do client projects and have entrepreneurial experiences. I’m really excited about the potential of students doing important work. Erin, is that sort of proactive approach dangerous? Is it premature? Are we right to have such a focus on prevention and care at this stage?
Erin Mote: Yeah. I mean, let me say one thing, and then I’m going to pivot. First of all, the conversations I have with every district or state leader or federal leader when I’m talking to them about AI is they’re really worried about minor data and the use and commercialization and monetization of minor data, right? That is a huge concern. Safety is sort of the only starter conversation that I have. When we can address safety, we can have lots of conversations about what we need to do next. If any of you have heard me talk before, I believe AI is an arrival technology like electricity, like the internet. There’s sort of an incumbency here for us to help folks who aren’t maybe ready to take that leap to get a solid foundation to be able to do that work that Julia is talking about—to go have that social license to innovate, that social license to explore how students can better drive their learning.
Listen, we need to move from a schooling system to a learning system. Tom has been hearing me say this for more than a decade. I think I should just get a tattoo. We’ve got to do it. We are missing the mark on preparing young people for what the world looks like. This is work that John Seely Brown has talked about for probably more than a decade, which is students are whitewater kayakers. The schooling system that we have does not teach our students how to be whitewater kayakers, how to read the currents to really move forward. Christian and I were on a session yesterday for EdSafe AI Fellows, where one of our speakers said, “Discernment is the new literacy.” So I want folks to be thinking about that. How do we scaffold and create learning experiences that drive discernment, that help young people learn how to manage AI, and to also tackle the core question that we’re seeing in the chat, which is, “Should I be using AI for this?” And what is the impact of AI when I’m using it? Every time you put a prompt in, it’s like throwing a water bottle in the trash. I just want to remind folks of that.
Tom Vander Ark: I appreciate that. I think you gave a great both/and answer: we can invite students into AI literacy and invite them into using AI in creative and important ways, and we have to start with safety. That’s a big deal. It has to be a state priority. I appreciate that. Josh, thanks for the Zoe Weil plug on being solutionaries. We love that idea.
Julia Fallon: Can I add one thing to that? What I love about AI, and this is like a secret of mine, is that it’s forcing us to have really hard conversations about what school is and what it means to design learning environments in a modern way. Not just layering on more technology, because if that were enough, everybody would be happy after the pandemic, and they’re not. We need to actually rethink the systems, and AI is forcing us to rethink them. I would like to see state education agencies, in particular, leading that work and modeling what it looks like, changing how we’ve done things. It’s an opportunity, and I always think about how to take advantage of these types of opportunities.
Tom Vander Ark: I’ll just use that as an excuse to say that a lot of what we’re seeing from EdTech vendors right now is just automating bad pedagogy, and we don’t need any more of that. So there’s a both/and here: rethinking school learning experiences and inviting young people to do important work, while, as both Erin and Julia have said, safeguarding their data and making sure their experiences are both safe and effective.
Colorado’s Approach to AI in Education
Tom Vander Ark: Let’s hear from Rebecca about what’s happening in Colorado. The Colorado Education Initiative (CEI) has helped develop this roadmap for AI in Colorado, and there are so many really exciting things happening on the AI front. Rebecca, what’s going on?
Rebecca Holmes: It takes an incredible amount of hubris to present after Erin, so I’ll echo what everybody will say on this call, which is just enormous gratitude for your leadership, Erin. Tom, I wrote down your adjectives—dangerous and premature. I think that’s exactly what we decided to be in Colorado, but only by state agency standards. So I think we’re in the right spot.
Some of this will require a little context. CEI was founded out of our state agency 18 years ago to be many things, including an R&D lab and to push an innovation agenda that the state agency realized it could not. I would encourage you, if you’re in a state that doesn’t have something that can take that role right now, to try to find or spin up something like the role that we are fortunate to get to play.
We saw the first two state K-12 reports come out 18 months ago, and our state agency was in the middle of a leadership transition. We said, “This won’t happen in Colorado in this moment.” We have the incredible privilege of sitting, like I said, outside of the agency. It meant, as a bridge to the conversation we’ve had so far, that we could put out a state roadmap that certainly mentions data privacy and acknowledges guardrails but took a stance as being much more about the potential for reshaping learning than the guardrails that the state would have to pay the most attention to.
This roadmap came out 13 months ago. I’m writing in pencil because it already seems so dated, but it is available for anybody who wants to see it. Essentially, what that’s allowed us to do is raise some quick resources to support districts in the state.
The other thing that’s important to always understand about Colorado is that we always argue if it’s us or New Hampshire that has the highest degree of local control. What that means in a moment like this, this arrival technology moment, is that we could have 178 different approaches to policy. We can have 178 degrees of opportunity—or not—for young people. So we always will meet this moment with, “How do we support early adopter districts to learn from each other so that nobody has to go alone?”
Those early adopter districts have leaned in, in part because of the philanthropy you see we’ve been able to attract here, and are looking at these four buckets of district approach. We are starting with AI literacy and AI readiness as the first bucket and letting policy come where it is appropriate, but not putting that first, as I mentioned.
The second category here is around district infrastructure. The exciting thing about this moment is that it can be the first time larger school districts really identify who their early adopter teachers and early adopter schools are, and create the conditions not for the district to tell schools what to do, but for our most innovative educators to feed up to a learning agenda that districts really can’t and shouldn’t ignore.
That takes us to our really big commitment here around shared learning structures. This is about trying to make sure that what folks learned just 90 days ago gets picked up really rapidly by other school districts. Then there is certainly a role for states beyond the protectionist one we do think the state agency needs to play: rethinking the way state agencies provide guidance. When I was associate commissioner, we provided guidance and then ignored that guidance until five years later, when legislation told us to revisit it. There’s a really interesting, nimble policy posture here; we can ask state agencies to approach guidance issuing in a totally different manner right now, which we think is really exciting.
On the policy front, I mentioned the roadmap. I’ll go quickly through the other things that we are supporting districts to do right now in Colorado. The first is the largest investment of those resources. It’s state money, and it creates a project called Elevate AI. We were able to get eight districts—by design, four small and rural, and four relatively urban and large. We are the place people actually come to whitewater kayak, so urban and large means something different in Colorado than many of the places that you all are from. But this is rapid support with systems leader support, superintendent coaching, new professional development for whoever is running edtech in these districts—and sometimes that is the superintendent because some of these places are quite small—teacher fellowships, and a lot of infrastructure for that shared learning.
Another shoutout here to AIEDU, who’s been super helpful at this. We did launch this with something we don’t want to continue doing, although I love Magic School. We launched this with one product—not that everybody has to use it—but we said the easiest thing will be if there is some kind of product attachment for the teacher fellows to come together on. We chose Magic School, and I’m happy to talk more about that if people have questions just for this project.
The second project here is that my team really doesn’t want this on the slide—it comes out next week. They keep saying it’ll be in beta for months. But we said we’re going to see lots of curriculum products, and to Tom’s point, lots of people automating bad pedagogy. But what we really want to see is a K-12 competency progression. So we have worked to release that next week. It comes out with so much humility—it is a messy first draft. If other states are working on something similar, we would love partners on this. But we felt like it was the kind of way we could lean into that state agency role and play something that would be useful for districts who are at all stages of adoption.
Utah’s Approach to AI in Education
Tom Vander Ark: Matt Winters is going to tell us about what’s happening in Utah. I know, Matt, you’re in the thick of it. You’re meeting with a group of parents today. What are you going to talk about?
Matt Winters: That actually already happened, and that’s a great lead-in—taking in all stakeholders. I think that’s one of the biggest things that has been a part of the work I’ve been doing here at the State Board of Education in Utah over the last year and a half. This conversation doesn’t just involve technologists or teachers; it involves everyone in the ecosystem.
One of the things I want to start off with is, as we talk about Utah in the AI space, I’m inheriting and standing on the shoulders of some really wonderful ecosystem work that’s happened over the last couple of decades here in Utah. Whether it’s the work through Utah Education Network (UEN) that has built the ISP for our public schools across the state—where it’s free for public schools through the legislature—or contracting work, the portrait of a graduate work that’s been happening here at the State Board, or our digital teaching and learning grant that’s provided computers and coaching, it’s been an amazing ecosystem to inherit. Utah has been able to jump in feet first and do some really cool things over the last 18 months.
One of the first things that we worked towards, because of our wonderful ecosystem and community here, was that we had a group of technology directors and coaches come together from across our districts and charter schools and say, “We want safe, cheap, effective AI tools in our classrooms.” We went to Utah Education Network, which does a lot of the consortium pricing for the state of Utah, and we ran an RFP for AI tools. Now, in the state of Utah, any publicly funded school can get SchoolAI, Magic School, or Skill Struck—and now we’ve added Brisk as well—at a severely discounted cost per student per year for use in the LEA.
That’s been incredibly useful for a couple of reasons. One is it’s helped with getting data privacy in place for tools that our teachers are already using. With those statewide contracts, data privacy is part of that. When an LEA signs onto that contract, we have data privacy assured through UEN. On top of that, it has provided a decrease in what I’m hearing in the ecosystem as the “AI divide,” where districts would like to use AI, but they maybe don’t have the funding or the infrastructure to be able to do that. In Utah, we’ve cut down and reduced that friction to enable LEAs to make really interesting choices about what kind of tools they’re using and then start supporting that work.
Statewide AI Tools and Training
Matt Winters: The other thing that we’ve been working on in terms of ecosystem is that we’re uniquely poised here in the state of Utah because we have a first-of-its-kind office called the Office of AI Policy. Their URL, if you’re interested, is ai.utah.gov. I am in no way jealous of that URL at all. They’re a policy lab that works on local policy and works to understand specific aspects of life and how it associates with AI. The first thing they looked at was mental health and chatbots, and then they provided legislative and policy work to our legislature. This last legislative season, in February and March, they passed a first-of-its-kind bill around mental health chatbots and how they should interact with humans, but also how they should work as mandatory reporters. We’re working through that at the State Board right now to figure out how that should work in schools. That office—again, first of its kind in the country—has been really useful, and I’ve been lucky to work with them in our larger educational and AI initiatives.
The other big thing we’ve been working on throughout the state of Utah is teacher skill development. I go back to what Rebecca said about our need for teachers to be upskilled and to understand what the technology is and how it works, but also for our parents and students as well. Thanks to a $500,000 grant from Intermountain Healthcare, we’ve been able to roll out training for teachers across the state of Utah—not just in one LEA, but any LEA that would like to have us come out and speak with them. Since April, we’ve gone out with the intention of training about 2,000 teachers. We’re hitting the 5,000 mark in the next couple of weeks, which is roughly a sixth of our teaching population. That’s for the first layer of training. The second layer is a Canvas course, and we have about 1,000 teachers in that. Of those, we’ve paid about 300 of them for lesson plans they’ve created around using AI or talking about AI in the classrooms through DonorsChoose.
In that, we’ve gotten some really great feedback from teachers. These are direct quotes from the documentation. One of them said, “The experience helped me see AI as a scaffold, not a shortcut.” Another said, “AI was the Siskel to my students’ Eberts,” which I just absolutely love. But my favorite one is this teacher who said, “This is the first time I’ve ever explored AI in lesson or unit planning. It comes at a turning point in my career. I was literally considering giving up teaching due to the increased workload, stress, and feelings of burnout that had begun to build up over the years of teaching.” Now she thinks she can handle it. We’re seeing this really great growth in our teachers: understanding what the technology can do, how to use it appropriately, how to be data safe, and how to approach AI tools to meet the needs of their students. It’s been really beautiful to see that work happening.
We’re closing out that grant on September 30 and putting all the lesson plans up on OER Commons as soon as possible after that so that other researchers and other states can look through them and build into it.
Federal and State AI Policies and Research
Tom Vander Ark: Josh asked about business partnerships, particularly those that might support effective uses of AI. Maybe that’s a client project or an internship that might incorporate AI. Corey just noted before that the CAPS network, which is usually an upper-division set of experiences for young people to do professions-based learning, is expanding in Utah. That might be part of the answer to Josh’s question. Are there going to be more business partnerships supporting AI? Anything on that front, Matt?
Matt Winters: Yeah, absolutely. I’m working with different organizations in the state of Utah, one of them being Talent Ready Utah, which is a big workforce initiative through our Utah System of Higher Education. We are actively pursuing creating some CTE (career and technical education) pathways and courses that will help build the capacity around AI. Part of that process is identifying industry partners that would be willing to work with students who are wanting to do internships and help build that next generation of engineers and AI builders.
On the flip side, there is a push in our CTE department to create AI literacy in K-8 as well. That is something my colleague, Christina Yamada, is working very actively on. Hopefully, we’ll have some things out in the next month to help teachers navigate what AI literacy looks like as they build alongside the computer science pathways that she’s built over the last decade. There’s some really interesting work in terms of the industry connection here in the state of Utah.
Tom Vander Ark: We had a number of comments in the chat about cognitive offload—these long-term questions about what life and learning with AI mean. You and your colleagues on the board are thinking hard about that. I love the level of teacher outreach that you are doing now. It’s really exciting and an example for the rest of the country.
Julia Fallon on Federal and State Leadership
Tom Vander Ark: Let’s hear from Julia Fallon, who leads the State EdTech Directors Association (SETDA). Julia, what would you like to say about Matt’s comments and add about what SETDA is doing?
Julia Fallon: I feel like I’m a panel respondent, like the old-school NSF kind of things. But the thing that I really love about Utah—because everybody’s always surprised—they’re like, “Who’s leading edtech strategy?” I’m like, “Utah.” And they’re all like, “Utah?” And I’m like, “Well, they are.” They’ve really focused on that ecosystem, and that is what has allowed them to move so quickly in this space. That is true leadership from the state—not only the state education agency but also from their legislature and other agencies. They’ve all got it together about what is best for their communities, really making sure that even in the most remote areas, there’s connectivity and students are getting those types of experiences as well. So, always kudos to Utah, and also to Matt for having the distinction of being one of those people dedicated to supporting this specific type of work.
We’re seeing that in other states as well. We have a handful of those. One of the things I want to mention about what SETDA is looking at is our State Trends Survey, which comes out every year. In about three weeks, we’re going to be launching our 2025 edition. For the first time since we started doing this, cybersecurity got bumped from the top. AI is the number one thing. I’m not doing any spoiler alerts here, but to nobody’s surprise, AI is giving cybersecurity a run for its money. Cybersecurity is still considered a priority, but AI is the conversation that states are having and grappling with—how to meet this moment.
Most of our respondents—47 states responded—indicated that they currently have an AI initiative in place, such as Colorado and Utah. You’ve seen a lot of the policy work that had to happen ahead of those initiatives. In our report, which is coming out, we showcase what those initiatives look like. We did a roundtable to highlight what those things look like across the country. I’ll give you a little teaser: a lot of it is around professional learning. It’s about how to prepare the teaching workforce to integrate these things into their instructional practices and classroom design experiences.
There are also states that have hired dedicated people to support that work. I expect we’ll see more of that as an example of a strategy that a state can employ to coordinate across the agency. This is where the modernization part happens—thinking about where it all integrates together within state agencies and all of the initiatives going forward. For example, if your priority as a state chief is chronic absenteeism, AI and technology can help with those things. How do you coordinate across initiatives? Often, what SETDA folks grapple with is, “What is the state’s role?” We know about local control and local context. It’s really about how the state can either build capacity or support local districts as they make these decisions, including our role in our most under-resourced districts. We don’t want them to be left behind in any way.
Tom Vander Ark: Julia, it’s interesting that in the last 60 days, the leading model providers have all provided a study mode and are increasingly making commitments to education, providing tools, and in some cases, supporting teacher training. Is that a good sign? Are you encouraged by that?
Julia Fallon: It is a good sign. We’re having more conversations. If you’re in the policy space and there are federal dollars coming out, we’re also going to be launching a Title II guide in November around how to use federal funds that come to states and districts for professional learning and how AI literacy is important. How can you pay for these things? Budgets are very constrained right now, given what’s happening across the country. How do you use those dollars? You could still be thinking about math scores and math instruction, but how does AI and technology support those things? How do you use some of these federal dollars to do all of that?
I feel like we’re moving in the right direction. I feel like we’re having those meaningful conversations. We’re not doing this willy-nilly, but I do worry because we have a whole section in the State Trends Report on cell phone bans. States came out with those bans, and we’re a little bit worried about them becoming device bans overall—not just cell phones. In certain states we’re hearing, “We’re going to get rid of the laptops. We have tech-free days.” We’re really worried about that because I don’t think you can take it all away. Again, it goes back to: How do we be intentional? How do we balance it? How do we help teachers still be great at what they’re doing without dictating exactly how they do it? That’s not what we want. We want technology to free them up to build the relationships you need for learning to happen.
One of the things—and I know, Tom, you’re familiar with this—is we are also looking at the absence of research in this space. In February, as a country, we experienced a sort of obliteration of our education research space. We need that, right? We need that evidence-based research because we want best practices out there and to do the research. So, SETDA is trying to stand up a specific research center primarily for edtech strategies so that we can help states replicate. A lot of times, I like to copy and paste from other states. They’re all very competitive—I’m not going to lie about that. They are competitive with one another, but they also like to borrow ideas and say, “Hey, you went first.” For example, California and Washington went out first with policy. We had them come in and say, “What would you tell your legislatures to do?” We want to be able to replicate that stuff so that we can have faster policy development and implementation phases.
Tom Vander Ark: We appreciate your impulse to get more states involved in active R&D. We’ve also highlighted the work of ALI, the Alliance for Learning Innovations, and super appreciate their leadership on the R&D front. Erin, do you want to step in there and talk about that? You could also comment on federal leadership and how positive it is in some cases and how dangerous it is in others.
Erin Mote: Yeah. Wow. Give me a doozy. So, on the education R&D front, I want to tie it back to your other comment. As Julia mentioned, education R&D at the federal level has been decimated, and right now industry is actually putting a ton of money into AI and education research, the learning sciences, and so on. One of the things we’re really pushing at EdSafe AI, both through the industry council and more broadly, is that the “E” in the SAFE framework is where we are weakest. We do not yet have the efficacy research at scale to equip folks with evidence-based practices and approaches for AI in learning sciences and pedagogy. We’re getting there, and industry has to be a partner with us in being more transparent about what they’re learning in their own research projects—what they’re learning about things like model welfare and how that affects the relationships that models and tech design have when encountering humans or new knowledge.
That’s a big push we’re making at EdSafe AI. We are also really pushing on public infrastructure—the idea that there needs to be public infrastructure for research and development at the federal level, but also at the state and local levels, that appropriately considers different contexts. We talked a little bit about our great district up in Cañon City, Colorado. They are in a really different context than our other policy lab in New York City, but they share a lot even though the contexts they’re operating in differ. How do you really take that on?
Listen, there’s a lot happening at the federal level. Everybody who knows me knows that I’m a deep, honest broker who believes in radical candor. There are things happening that are not great—Julia mentioned the massive cuts to federal R&D—but there are also things that give me a ray of hope. I shared an opportunity in the chat for states right now to apply for funds that are rapidly coming out of the federal government for the AI workforce and to tie AI workforce opportunities to education. I’m happy to see the work slowly coming out of NSF around AI test beds, and the research infrastructure for computing they’re providing, in line with what we’re calling for in the public infrastructure report.
The AI challenge, I think, could be an important opportunity for schools, communities, and educators to really shape this administration’s prerogative on the challenges that are in communities. Then, frankly, AI literacy—I said it’s the siren song, the clarion call we all need to answer. There are resources going to those initiatives, and I’m happy to see the Department of Education, the White House, and others signaling how important that is.
Tom Vander Ark: Thank you, Erin. That’s a great summary of both the challenges and opportunities at the federal level.
Julia, you mentioned earlier the importance of research and evidence-based practices. How do you see states leveraging this research to create meaningful change in AI adoption?
AI Literacy in Middle and High Schools
Julia Fallon: That’s a great question, Tom. I think one of the most important things states can do is to ensure that the research being conducted—whether it’s federally funded, state-funded, or industry-supported—is accessible and actionable for educators and policymakers. Too often, research sits in academic journals or behind paywalls, and it doesn’t make its way into the hands of the people who need it most.
One of the things SETDA is advocating for is the creation of open-source repositories where states can share best practices, lesson plans, and case studies. This would allow districts to learn from one another and avoid duplicating efforts. For example, if Utah has already developed a successful AI literacy program for middle school students, why shouldn’t a district in Georgia or Illinois be able to adapt that program for their own use? Collaboration is key, and we need to break down the silos that often exist between states and districts.
Tom Vander Ark: That’s a great point, Julia. Erin, do you see industry playing a role in this kind of collaboration?
Erin Mote: Absolutely. Industry has a huge role to play, not just in funding research but also in sharing the insights they gain from their own product development and user testing. For example, many edtech companies are already collecting data on how students and teachers interact with AI tools. If that data can be anonymized and shared responsibly, it could provide valuable insights into what works and what doesn’t.
But it’s not just about data. Industry also has the resources to support professional development for teachers, which is something we desperately need more of. Teachers are the front line of AI adoption in schools, and they need to feel confident and supported as they integrate these tools into their classrooms. That’s why partnerships between industry, states, and districts are so important. They allow us to pool resources and expertise to create solutions that are scalable and sustainable.
Tom Vander Ark: Speaking of professional development, Matt, you mentioned earlier that Utah has trained nearly 5,000 teachers in AI literacy. What kind of feedback have you received from those teachers, and how are they applying what they’ve learned in their classrooms?
Matt Winters: The feedback has been overwhelmingly positive, Tom. Many teachers have told us that the training has completely changed the way they think about AI and its potential in education. One teacher said, “This is the first time I’ve ever felt excited about using technology in my classroom.” Another said, “AI has given me back time that I can now spend building relationships with my students.”
In terms of application, we’re seeing teachers use AI in a variety of ways. Some are using it to differentiate instruction, creating personalized learning plans for their students. Others are using AI tools to streamline administrative tasks like grading and lesson planning. We’ve also seen teachers use AI to engage students in real-world problem-solving projects, like designing sustainable cities or analyzing climate data. It’s really inspiring to see how creative and innovative our teachers can be when they have the right tools and support.
Tom Vander Ark: That’s fantastic, Matt. Rebecca, are you seeing similar innovation in Colorado?
Rebecca Holmes: Absolutely. One of the things we’re most excited about is the way AI is being used to empower students as creators, not just consumers. For example, we have a middle school in Denver where students are using AI to design and prototype solutions to community challenges. One group of students created an AI-powered app to help their peers manage stress and anxiety. Another group used AI to analyze traffic patterns and propose changes to make their neighborhood safer for pedestrians and cyclists.
These kinds of projects not only teach students valuable technical skills but also help them develop critical thinking, collaboration, and communication skills. They’re learning how to use AI as a tool to solve problems and make a positive impact in their communities. That’s the kind of learning experience we want to see more of, and we’re working hard to support districts in creating those opportunities.
Family Involvement in AI Literacy
Tom Vander Ark: That’s such an important point, Rebecca. AI has the potential to transform education, but only if we use it thoughtfully and intentionally. As we wrap up, I’d like to ask each of you to share one piece of advice for educators and policymakers who are just starting to explore AI in education. Erin, let’s start with you.
Erin Mote: My advice would be to start small but think big. You don’t have to have all the answers or a fully developed strategy to get started. Pick one area where you think AI could make a difference—whether it’s improving student engagement, streamlining administrative tasks, or enhancing professional development—and start there. But as you do, keep the bigger picture in mind. Think about how AI fits into your overall vision for education and how it can help you achieve your long-term goals.
Tom Vander Ark: Great advice. Julia, what about you?
Julia Fallon: I would say don’t be afraid to ask for help. Whether it’s reaching out to other states, partnering with industry, or tapping into federal resources, there are so many people and organizations out there who want to support you. You don’t have to do this alone. Collaboration is key, and the more we work together, the more we can achieve.
Tom Vander Ark: Matt, what’s your advice?
Matt Winters: My advice would be to involve teachers and students in the conversation from the very beginning. They’re the ones who will be using these tools every day, so their input is invaluable. Listen to their ideas, their concerns, and their experiences. They’ll not only help you make better decisions but also ensure that the solutions you implement are practical and effective.
Tom Vander Ark: And Rebecca, you get the last word.
Rebecca Holmes: I would say don’t lose sight of the human element. AI is a powerful tool, but it’s just that—a tool. At the end of the day, education is about relationships, about helping students discover who they are and what they’re capable of. Use AI to support that mission, not replace it.
Tom Vander Ark: Thank you, Erin, Julia, Matt, and Rebecca, for sharing your insights and expertise. And thank you to everyone who joined us today. This has been a fascinating and inspiring conversation. Let’s keep it going. Together, we can harness the power of AI to create a brighter future for all learners.
Links
- Watch the full video here
- Mollick’s Prompt on GPT-5
- Blueprint for Action – AI Literacy
- Colorado Education Initiative
- Colorado Roadmap for AI in K-12 Education (PDF)
- Fireflies Live Notes
- New Yorker – Will the Humanities Survive AI?
- OpenAI Progress
- LinkedIn – Top AI Use Cases 2025 vs 2024
- LinkedIn – Sinead Bovell
- NYT Opinion – ChatGPT & Mental Health
- Humane Education – The Solutionary Way
- Hidden Brain Podcast – How Our Brains Learn
- MSA Blog – The EU AI Act
- MidPac – Shaping the Future AI Council
- What School Could Be – AI in High Schools
- LinkedIn – Warren Bowles
- MIT GenAI – Environmental Impact
- MIT News – Generative AI Environmental Impact
- Guardian – Ohio University AI Training
- Getting Smart Podcast – Don Haddad
- Your CAPS Network – Utah Career Education
- Apple Podcasts – School & Company Partnerships
- Ed3DAO – Portrait of a Teacher
- Gabriel Yanagihara Substack – Switzerland’s AI Approach
- Jungle AI App
- ALI Coalition – State Brief
- Getting Smart Podcast – State of Education R&D
- Canon City Schools
- SETDA – State EdTech Trends