Town Hall: Back to School with AI

Key Points

  • AI can help educators focus more on human interaction and critical thinking by automating tasks that consume time but don’t require human empathy or creativity.

  • Encouraging students to use AI as a tool for learning and creativity can significantly boost their engagement and self-confidence, as seen in examples from student experiences shared in the discussion.

This Getting Smart Town Hall focused on the integration of artificial intelligence (AI) as we prepare for a new school year. Nate McClennen facilitated a conversation with featured speakers Wes Kriesel, Kunal Dalal, and Jerry Almendarez, who are doing pioneering work in California. Listen in as they discuss the growing use of AI tools among students and educators, highlighting the importance of understanding AI’s role in education.

The speakers discuss various aspects of AI, including its potential to augment human intelligence and the need to focus on uniquely human competencies in the face of technological advancements. They also emphasize the significance of student agency, with examples of student-led initiatives and feedback sessions that reveal how young learners are already engaging with AI in innovative ways. The episode underscores the necessity for educators and administrators to stay informed and actively participate in the ongoing dialogue about AI to ensure its effective and equitable implementation in schools.

Outline

Introduction and Setting the Stage

Nate McClennen: Welcome everybody. Excited to have you here today. We’re joined by a great crowd from the Orange County Department of Education and Santa Ana Unified School District. Bianca can’t be here today, but the district is represented by Jerry Almendarez, and we also have Wes Kriesel and Kunal Dalal. I want to give a quick preamble, because I always think we need to start with a basic understanding. So I gave a quick prompt to ChatGPT to create a visual that articulates the relationship between AI, machine learning, deep learning, neural networks, NLP, and large language models. It gave me this amazing-looking thing that means nothing, which does show us the limitations of deep learning, of ChatGPT, and of large language models themselves. It’s not doing a good job here. On the next slide, you’ll see our version, which is just a real basic preamble. As we start to use all these tools in classrooms and help parents, teachers, and administrators understand them, we should have a basic grasp of what’s going on.

So AI, obviously, is technology that enables machines to simulate human learning. That’s the big bucket, and there’s a lot of AI; it’s been around for a long time, with the term coined in the mid-1950s. Machine learning is a subset of AI in which systems make predictions from historical data. The easiest example: I was a high school math teacher for many years, and we always taught regression in a statistics course. Regression is simply taking past data and making predictions about future data. Doing that by hand is in some ways an art; when you have technology doing it for you, it’s an AI system. A subset that’s emerged more recently is deep learning, which uses a neural network, an attempt to simulate the human brain with multiple layers and networks. And then finally we get to the piece that most of us in the AI world are using now, which is generative AI.
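To make that regression example concrete, here is a minimal, hypothetical sketch (not from the town hall) of what "making predictions from historical data" looks like in code: fit a straight line to made-up past enrollment figures, then extrapolate to a future year.

```python
# Minimal illustration of machine learning as prediction from historical data:
# fit a simple linear regression to past observations, then predict a future value.
# All numbers are invented for illustration only.
import numpy as np

years = np.array([2019, 2020, 2021, 2022, 2023])   # past inputs
enrollment = np.array([480, 495, 510, 528, 541])    # past outcomes (made up)

# Fit a degree-1 polynomial (a straight line) to the historical data.
slope, intercept = np.polyfit(years, enrollment, deg=1)

# Use the fitted model to predict an unseen future year.
predicted_2025 = slope * 2025 + intercept
print(f"Predicted 2025 enrollment: {predicted_2025:.0f}")
```

The same fit-then-extrapolate pattern underlies far more sophisticated machine learning systems; only the model and the amount of data change.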

So these deep learning models can create original content based on the data that’s fed into them. In its most fundamental form, everything we as humans have been able to put into digitized form, whether text, video, or sound, is used to train a large language model, and that model can then produce new data based on the probability that the next word or the next image is a good representation of what came before. Somewhere between the dystopia and the utopia is the likely middle ground: AI tools and functions will be embedded in ed tech ubiquitously. We think a lot about this idea of augmenting intelligence for learning.
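To illustrate the "probability of the next word" idea at toy scale, here is a hedged sketch (again, not something shown in the session): a tiny bigram model that counts which words follow which in a miniature made-up corpus, then generates new text by sampling each next word from those counts. Real large language models do this with neural networks trained on enormous datasets, but the generate-the-next-token idea is the same.

```python
# Toy illustration of next-word prediction: count word-to-word transitions in a
# tiny corpus, then generate text by sampling each next word from what was
# observed to follow the current word.
import random
from collections import defaultdict

corpus = ("students learn with teachers and students learn with ai "
          "and teachers learn with students").split()

# Record which words follow each word (duplicates act as probability weights).
transitions = defaultdict(list)
for current_word, next_word in zip(corpus, corpus[1:]):
    transitions[current_word].append(next_word)

# Generate new text one word at a time.
word = "students"
generated = [word]
for _ in range(8):
    candidates = transitions.get(word)
    if not candidates:
        break
    word = random.choice(candidates)  # sample proportionally to observed counts
    generated.append(word)

print(" ".join(generated))
```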

I know Vriti’s online, and Vriti and I have presented about this a bit before: this idea of how we can take intelligence and augment it for learners. We think that’s a strong middle ground. Something we’ve been thinking about a lot is what critical human competencies, or uniquely human competencies, we need to double down on in the face of an AI revolution. We have to pay attention to those competencies, where they show up in portraits of graduates, and what is uniquely human. And then I think one of the biggest shifts that’s going to happen is the educator role change. What does it mean to take away some of the things educators have done in the past that can maybe be done by AI, and to reframe the educator as coach, connector, mentor, project maker, community connector, and so on? That reframing is going to be really important, and I think it will be really good for young people.

The Rise of AI in Education

So a couple of things about who is using AI. This is a great survey the Walton Family Foundation commissioned from Impact last May, and it has some interesting data. The bottom line: in the survey, at least 50 percent of teachers and K-12 students report using ChatGPT at least weekly, both in and out of school. That’s a huge increase, roughly 25 to 27 percentage points since February 2023, and I suspect it will keep climbing over the next year or two. Access will be important, to make sure all students have access in appropriate ways. So that’s the first thing: more and more students, more and more teachers, and it’s important to note that parents are using it more and more as well. And how are they using it? I think there are two major ways. They’re either using the base model, what the survey calls the website, like ChatGPT or Gemini, or, more likely, they’re also using a third-party app built on one of these large language models.

So here are some of the platforms we’ve been paying attention to; this is not an exhaustive list, and there are many great platforms out there. Magic School seems to be a go-to for a lot of teachers because it’s a one-stop shop. Diffit is a great differentiation tool that we’ve seen some folks use. Inquire has partnered with High Tech High around their kaleidoscope work to figure out how to do great project-based planning using the High Tech High framework. PlayLab lets you build your own apps, which we think is important for agency, for young people and for the educators who work with them. And SchoolJoy is another piece of software we’ve been watching that’s doing a lot around school-to-career pathways and project making, again trying to be a one-stop shop for AI tools and to make implementation easy for educators. So two different ways: using the raw models directly, or using the applications that embed them.

And just to wrap it up, our general consensus as we look across the landscape is that use is definitely increasing. But there’s variable policy guidance: some states have it, some don’t; some districts have it, some don’t. There’s variable professional learning: some is emerging, but its quality is still developing, and people don’t know where to find it. And there’s variable student education: how are students learning about this? Some people asked, when they registered, how we teach students about this at the elementary, middle, and high school levels.

And so we’ve listed a few folks that we interact with. Teach AI has a great set of policy resources available on their website; if you’re looking for good slides to share and some policy examples, it’s a great place to start. Ed3DAO, and I know Vriti’s here today in the chat, provides professional learning on all sorts of things, from Web3 to how educators can use AI, with a bunch of courses available. And the AI Education Project has a bunch of resources that go directly to students, which we think are pretty interesting. Again, this is not a comprehensive list; a lot of people are doing interesting work, but these are a few to think about.

All right, so we’re going to go to the most important part of this town hall. So basic preamble, probably all super familiar for you all. I’d love to hear Kunal and Wes talk about Orange County. So I think the general question we’re wondering is what’s happening in Orange County at this point in time with AI and how are you setting up for the new school year, and what are you thinking about in general? Not sure who’s going to go first, but take it away.

Wes Kriesel: Because this is new, before we jump into the next school year, I want to go back a year, because our positions didn’t exist a year ago. Our county office put together monthly superintendent meetings around AI called OCAI Forward. Then we came in around November, December, and I want to speak briefly about something called 100 Conversations About AI in 100 Days, which started when I became an AI administrator. Kunal and I just drove around the county and literally had hour-long, one-on-one conversations with people, asking them: How are you using it? What is it? How are people using it around you? How did it feel? What do you hope for? What are the barriers? Just six simple questions, an hour each, with, I would say, 90 percent of the time them talking. We recorded these, transcribed them, analyzed them with AI, and that really felt like the foundation of our approach.

Engaging Students with AI

So one of the things we found out, partway into those conversations, was when we had our first student conversation. We had to beg educators: “Who are the students using AI? Who’s obsessed with it, who you can’t get off of it?” And a teacher at Estancia High School in Costa Mesa said, “You’ve got to talk to this student, Kieran. I don’t know what he’s doing. He’s in the hallway on ChatGPT every day.” That one interview with Kieran changed everything for us. When we got in the conference room with him, he said, “ChatGPT has changed my life.” And we’re like, “Okay, that’s not how we thought this was gonna start.” The second thing he said was, “I’ve been waiting forever, for a year, to tell an adult what I’ve been doing. And I haven’t been able to talk to anyone about it.” Again, not what we expected. We said, “Tell us more.” He said, “I like anime. And when ChatGPT came out, I asked, ‘Can you write me an anime story?'” That was his dive into using ChatGPT four or five hours a day to write stories. And Kunal had the insightful question, “Did this do anything for you? You said it changed your life, but was there a moment when you knew you had changed?” And he said, “When my English teacher gave me a paper-and-pencil final in class and said, ‘Write an essay,’ I would normally be somebody who would shut down. I would turn off, put the pencil down, and not even try. I had no imagination; I knew I couldn’t do it. And when I got into that essay and started writing, I don’t know what happened, but I wrote not five paragraphs but seven paragraphs in 10 minutes and I was done. I amazed myself.” And he said, “I think ChatGPT rewired my brain.” Kunal said, “No, I think it rewired your confidence.”

But we’ve spoken with him again, and we have taken this idea with us. I would call it paying attention to the silences: who is not talking about AI? We’re talking about policy guidance and all these things, but we need to listen to our students and how they are using AI, because they have a very different approach. This is a quote from Sophia Lee at Westminster High School: “I see it as my life.” When we talk to educators, they say, “I see it as my job. It’s impacting my job, my work.” Students look at it through the lens of culture, through the lens of their personal passions and what they do outside of school.

So with that, that’s just a grounding. I’ll turn it over to Kunal. Maybe you can talk about the student AI convening that came out of that work.

The Impact of AI on Educators and Students

Kunal Dalal: Yeah. And one thing you’ll notice is that we don’t have a slide deck we’re going through, and that’s intentional. That’s not just laziness on our part, although maybe a little bit. When we do trainings, when we do sessions, we usually don’t have slides, or we might have one or two. The idea is that we keep going back to AI itself: it’s actually going to center us in our humanity. It’s not about the technology. It’s not about centering ourselves in some artificial intelligence tool. AI offers a reflection of our own work and of what we value, and the more we engage with it, the more we get to ask those questions about ourselves. Some of us are more comfortable in that space and some of us are less comfortable, and I think that’s led to an uneven adoption of AI among people.

I think that’s actually a commentary on how far we’ve come in our comfort level with self-reflection, with staring at ourselves in the mirror and really asking what our value is in our work. What do we do day to day? One of the things that came up over the past year: we were at a wonderful conference, the Deeper Learning Summit in Anaheim. We brought some students, and there was a panel with corporate leaders up on the stage talking about students, what students need, how students can benefit from artificial intelligence, and how they might learn best. And a couple of students at our table said, “We’re sitting right here. They could just ask us. They’re talking a lot about it up there, but they’re not asking us.” In that moment, I looked over at our associate superintendent and some other folks at the table and said, “Can we do a summit that’s just students? Students run the sessions, students do the keynote, students are attending. Can we just do that? And can we do it in four months, before the school year is over?” This was in February. We ended up pulling it off at the end of April. We had 14 breakout sessions, every single one a presentation by a student about their feelings around AI. Some of it was technical, some of it was social. We had conversations around social justice and how AI can bridge conversations and build empathy. And we had sessions where students just asked their peers what they do with AI and built out a collaborative workspace together.

And then finally, I think one of the coolest things for me, and I say this selfishly because I owned this process: it went well, and I was excited that it went well, because it could have been catastrophic. Okay, catastrophic is a bit of a stretch. I can be a little hyperbolic sometimes, but you can see why I was a good science teacher, right? “Oh my god, it’s gonna explode!”

But we decided we wanted students to do the keynote, right? This is a student-run convention; all day they’ve got lunch, they’ve got other sessions. Why would an adult go up there and do the keynote? So we had three students do the keynote, and I worked with them over the weeks preceding the session. I said, “What do you care about? Let’s build this out.” And the consistent thing they said was that they wanted to figure out how to make it a collaborative keynote: how can we get the voice of every person in the room into this keynote? Now, at the time, we thought we were going to have maybe 100 people in the room, which is already big. We ended up having 600 people in that room. So it was way bigger than we thought, and we had to call a few audibles.

But what we did was have the students type in a word using one of those tools (it was Slido, but there’s Mentimeter and plenty of others) where everyone texts in a word and it creates a word cloud. We had that word cloud up on the huge projector at the front of the conference room. Then we took a screenshot of it, put it into ChatGPT, and said, “Make a story with all these words.” Now, the second those 600 students took their phones out and started talking, I thought, “It’s done. It’s over. We’ve lost the room.” If I did this with 20 kids in a room, I would have lost them; this is 600, and I don’t know a single one of them. But the second they started to see their words, the things they had typed in, show up in the story, they were like, “Oh, did you see that? Oh my God.” And they went silent in one second. They say you could have heard a pin drop; you really could have heard a pin drop in that room.
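For readers curious how that word-cloud-to-story step could be wired up, here is a hypothetical sketch using the OpenAI Python SDK. It is not the presenters’ actual setup: it skips the screenshot and simply passes the collected words as text, and the model name and word list are illustrative assumptions.

```python
# Hypothetical sketch of the keynote exercise: take audience-submitted words
# (e.g., exported from Slido or Mentimeter) and ask a chat model to weave every
# one of them into a single story. Assumes the openai package is installed and
# OPENAI_API_KEY is set; the word list and model name are made up.
from openai import OpenAI

client = OpenAI()

audience_words = ["anime", "soccer", "music", "robots", "family", "tacos", "space"]

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model choice
    messages=[
        {
            "role": "user",
            "content": (
                "Write a short, upbeat story for high school students that uses "
                "every one of these words: " + ", ".join(audience_words)
            ),
        }
    ],
)

print(response.choices[0].message.content)
```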

And so I share that because I saw a comment earlier about student agency and student leadership in this space, and I think we have a lot to learn. One of the things Wes and I always say is that we are not experts in this. None of us are experts in this. We are at best a little bit ahead, maybe a day or two. But we are here to walk alongside our students, and that’s what our students are asking us to do. They’re like, “We know teachers aren’t experts at this. This is all brand new. This is brand new for everybody. So we hope the teachers will walk alongside us and learn this together, rather than feeling like they have to know everything before they walk into a room and teach us everything.” And so that’s one of the cornerstones of our work: student agency.

And the final thing I want to add is something we talk about a lot; I heard something earlier that reminded me of it. For a while now, in our digital world, we have been judging ourselves, and being judged by our supervisors and the folks who manage us, by our digital footprint. How many slides did you create? How many emails did you send? How many documents, how many spreadsheets, how many cells in the spreadsheet did you fill out? We are being judged, and judging ourselves, by our digital footprint, but that footprint is nothing but bits and bytes through cables. It’s just binary code sitting on a server somewhere. There’s nothing human about the spreadsheets and memos we make.

But an AI model is never, ever going to be able to hold the hand of a child who just lost somebody close to them. An AI model will never be able to do that. An AI model will never laugh so hard at a good joke that it can’t stop laughing. These are things that we as humans do. And if we can find a way to stop judging ourselves by the digital footprint we create, and just say, “Okay, let me figure out how AI can take care of a lot of that,” then we can refocus on the human stuff, because that’s our job, right? We’re not accountants; we’re educators. Our job is a human job.

Nate McClennen: Can I just pause for a second? We’re going to have to pivot to Santa Ana in a minute, but before we do, I want to emphasize this point. Do either of you, you or Wes, have a sense that AI is actually providing an opportunity for us to rethink what’s really important about being human? Is this a parallel path, one that’s really important and that schools can’t miss? Is that what you’re both saying here?

Wes Kriesel: I feel like verbal production is the greatest measure, the most important measure, of learning. It’s not about writing, so don’t quote me on that. When I say what I think, I get to evaluate whether I said what I believed inside. So with verbal production, when we introduce AI, I’m not only able to talk to the instructor, the leader, the adult in the room; I can talk to the AI, and we can collaboratively hear the conversation within a different power differential. I’m not telling you my thoughts and you’re saying that’s right or wrong; we’re listening to the conversation from a different angle. I think the way we collaborate can be drastically changed.

Kunal Dalal: And there’s a great article that was released in The Atlantic just this week or last week; some of you have probably read it. It’s about the rise of metacognition, the idea that AI is forcing us into this metacognitive space. It’s something we teach a lot in leadership. We try to get our students to do it, but oftentimes we don’t do it ourselves. We’re just trying to grind, and we forget the why of the grinding, the why of all of it. AI gives us an opportunity to get really clear about that. And that’s scary for a lot of folks; it’s a little scary for me. So I think that can also be a barrier.

Wes Kriesel: And I do want to throw out there, in one minute, an overview of what’s coming this year; I’m so eager to learn from Santa Ana. In the last eight to 10 months, what we’ve learned is the silences, so I want to go back to that because I saw somebody put it in the chat. We’re learning that we, as County Office of Ed leaders, can hold parent sessions. That’s one of the places where we see districts not fully running forward, Santa Ana being an exception. Go Jerry! And classified staff: we’re doubling down there. We have a weekly Zoom call on AI strategies and tools for people who run offices. That’s a silent area. So we’re asking how we can add value to what districts are already doing; we can address the silences. And then every month we’re going to hold a teacher roadshow summit for PD around specific subject areas, but districts are already moving in that direction. So I just want to highlight families and classified staff; those are two areas where we see silences. It’s so important to empower families, because we believe in, and this is a soundbite I got from Kunal, “Take the pressure off the classroom. This is a social revolution. Let’s empower families with AI to have better relationships with each other.” Anyway, I’ll stop.

Nate McClennen: To me, it’s amazing. I really appreciate all those comments. Looking at the chat, there are a bunch of things resonating with this: listen to the silence, whether it’s parents or classified staff, or the student who never talks, the one who’s not heard. Here’s an opportunity. And then this idea of centering humanity. Kunal, you said it really well: this is a really interesting opportunity for us to rethink what it means to be human and to double down on it. Diana in the chat put out a link to the survey Walton just did on engagement, and engagement is still really low. So this is another opportunity. If we know students learn when they’re engaged, we also know they learn when they feel like they belong, feel valued, and have purpose. This is an opportunity to use AI as a tool to get us there. Wes, Kunal, there are a lot of questions for you in the chat; feel free to answer those there. And now we’re going to pivot to Santa Ana and Jerry Almendarez. Jerry, I’d love to hear your thoughts. You’ve got a lot going on in Santa Ana.

Jerry Almendarez: Yeah. Thank you. And I apologize for Bianca not being here; she is sick, but she wishes she could be here, and a lot of the work I’m going to talk about is work she’s doing. My name is Jerry Almendarez. I’m the Santa Ana Unified School District superintendent. I’ve been here approximately five years. I arrived in 2020, three months before COVID hit, and experienced this once-in-a-lifetime pandemic, and through that time had plenty of time to think about reimagining what education and leadership look like. In Santa Ana, we’ve been doing a lot of work. We have approximately 39,000 students, 53 schools, and 6,000 employees. We spent a lot of time during the pandemic working on our graduate profile, pulling a large number of community members together to have a conversation about what high-performing educational organizations across the world look like and what their characteristics are, and then comparing them back to Santa Ana Unified, which ultimately evolved into our current graduate profile.

In addition to that, what we realized during the pandemic was that we had to make leadership shifts. We had to change the way we were thinking; we wanted to come out of the pandemic better than we went in. We quickly discovered that although we thought we were future-ready with one-to-one devices throughout our schools, our staff were struggling with how to use those devices, how to connect, and how to use Zoom and the various platforms, and we did a lot of reflecting. That reflection landed us on five areas for shifting our thinking to approach this new environment in a different way. And here’s the process: strategic alignment. We have our board priorities, which we use as leverage to have a conversation about the learner profile, and you can see the six areas there: world-ready scholar, architect of their learning, collaborative leader, global innovator, empathetic communicator, and community builder.

These were conversations and discussions that came out of the current research in the 2020-23 timeframe, to help guide our thinking and to reimagine what teaching and learning look like in Santa Ana. So that’s our charge; this is our North Star. When it comes to the use of AI, we’re approaching it with this Venn diagram, in three categories. One is the operational efficiency lens: how do we improve workflows to free up time and be more accurate? Another is using it to enhance our decision-making, as Wes and Kunal indicated, using it as a thought partner, having it in the room listening to our conversations and helping us develop strategic plans. And the area we’re most excited about is improving teaching and learning experiences, which is directly related to how students and teachers use it in the classroom. We’ve spent the past year and a half working on those three areas, trying to navigate how we embrace this new innovation.

Future Directions and Final Thoughts

Another study is the Microsoft Work Trend Index, which came out in 2024 and reflects the national workforce: 75% of people are already using AI in their jobs, and of that 75%, 46% just started using it within the past six months. So it is definitely being used. In organizations or districts that tend not to have a conversation, that are silent, as Wes was saying, employees are most likely already using it; they’re just not telling anybody. And what’s happening is you see individuals using it on their personal devices because their work devices may be monitored. So it is here to stay. My concern isn’t so much the students learning it; I think they’re already there and embracing it. My concern is whether the adults in the system are learning it, and by adults I mean superintendents and executive cabinet, because that’s really what’s going to grant permission for the rest of the district to actually embrace it.

The other interesting aspect, and this is the employers now: 66% of employers say they wouldn’t hire somebody without AI skills, and 71% would hire somebody with less experience who has AI skills. As this report came out and I started to reflect on it, I thought: are we creating opportunities for our students in the classroom that will allow them to compete, based on this study? And if we’re not, why not? And are we giving our teachers the ability to embrace this innovation, this platform, so they can give the kids those opportunities? I wasn’t confident that we were doing enough in those areas.

We continued the conversation. The other thing we found out, when I went into classrooms: these are from the World Economic Forum, the trending skills kids are going to need in the new work environment. As I go into classrooms and look at what is happening, my reflection is, are kids being given opportunities to build these skills? I didn’t see what I was hoping to see, and I knew we had to do something different. The other question is, are we training teachers and providing the staff development for them to really understand and embrace these new skill sets? So a lot of work and a lot of conversation centered on the adults: what are we doing to support and train them? We began to explore AI, or ChatGPT, and we actually discovered it on TikTok.

I have what I call a large number of reverse mentors, some of them on this session here, who encouraged me to be in spaces I wouldn’t normally be in. One of them was TikTok. The comment was, “If you’re going to be a superintendent of a large urban district and make decisions for future generations, you need to understand what those generations are doing right now. And one way to get a better understanding is to become a member of these various social media platforms, TikTok being one of them.” So I actually discovered OpenAI’s release of ChatGPT on TikTok within the first week it came out. The more I looked at it, the more curious I became. I went home and got on YouTube just to verify that what I was seeing was actually real. Then I discovered that the people talking about it, the researchers, were actually very credible. So I subscribed to ChatGPT, started to play around with it, and very quickly discovered that this was something unique and special.

I brought that to my executive cabinet over the next couple of months. At first, they didn’t want to hear it; they thought I was chasing a shiny object. They didn’t know much about it, and they nodded their heads in acceptance but really did not embrace what the platform could do. Then I started to bring research articles and YouTube videos to executive cabinet, and we started to experiment with it there. Very quickly they discovered the potential and shared it with their executive directors, both certificated and classified. The executive directors got excited and shared it with our teachers on assignment, who are classroom teachers, and our future-ready teachers, and said, “Hey, anybody who’s interested in experimenting or exploring these, let us know, and we’ll open up these platforms for you.” So the teachers who provide our staff development started to experiment and play with it, and they said, “Wow, this is great. We need to take it to our classroom teachers.”

And so as they facilitated staff development throughout the district a few years ago, they started to use the tools in their sessions. We got some support and some resistance, but we didn’t push it on anybody. The teachers who embraced it really discovered how powerful it was, and as they used it over the weeks and months, their colleagues started to tap them on the shoulder and say, “Hey, we want to know more about this.” As a result, our staff development teachers came back and said, “Okay, Mr. Almendarez, there’s a large number of teachers out there using these AI platforms. We don’t really know which ones, because they keep coming to us saying, ‘Hey, have you seen this? Have you seen that?'” It seemed like a good time to pull together some sort of AI advisory committee to work on AI guidance. So a committee of about 60 was pulled together: teachers, principals, administrators, certificated and classified staff. We went through about a four-month process, doing something similar to what Wes did across the districts, really having a conversation about what platforms they’re using, how they’re using them, who is using them at the school sites, what concerns they have, and what hopes they have for this new innovation. We assigned some of them to go out and experiment and then asked them to come back and give us feedback.

As a result of four months of that conversation, we created our AI Compass, which is on our web page. It provides the guardrails for our teachers and opens the door for more experimentation and exploration of these platforms within the safety net of the district and district policy. As a result, more and more teachers began to ask questions and want to get involved. Around this time we reached out to the Institute for the Future, and I know Sarah is here, so shout out to the Institute for the Future, and asked them to come in and help facilitate conversations with what we call innovative teachers. This is a group of volunteer teachers who come together around a strategic foresight framework to really reimagine what teaching and learning look like in the classroom. We’re identifying signals that are out there, like blockchain and cryptocurrency, and trying to make sense of them to create multiple futures that we aspire to accomplish. That’s what this team is currently doing.

Some of the ways this team is using it: we talked about student listening sessions. The first year, about three years ago, we did a 600-student listening session tour. We toured about 30 schools and got about 60 hours of audio. It took six to seven months to transcribe that audio, make sense of it, and do an analysis, and we then used that data the following school year to change and modify our curriculum. Over the past couple of years, we’ve reduced the number of sites we visit. We took 30 hours of audio, plugged it into an AI platform that transcribed it for us in minutes, plugged the transcription into ChatGPT, and it did the analysis. We were able to take that analysis in the current school year and, within weeks, get it into teachers’ hands to allow them to pivot. This falls under operational efficiency and effectiveness.
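Santa Ana’s exact tooling isn’t named beyond ChatGPT, but as a rough, hypothetical sketch of that transcribe-then-analyze workflow, one could chain a speech-to-text call with a summarization prompt, for example with the OpenAI Python SDK. The file path, model names, and prompt below are illustrative assumptions, not the district’s implementation.

```python
# Hypothetical listening-session pipeline: transcribe one audio recording,
# then ask a chat model to surface themes teachers can act on.
from openai import OpenAI

client = OpenAI()

# 1) Transcribe a session recording with a speech-to-text model.
with open("listening_session_01.mp3", "rb") as audio_file:  # illustrative path
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# 2) Summarize the transcript into actionable themes.
analysis = client.chat.completions.create(
    model="gpt-4o",  # assumed model choice
    messages=[
        {
            "role": "user",
            "content": (
                "Summarize the main themes, concerns, and requests that students "
                "raise in this listening-session transcript, as bullet points "
                "teachers can act on:\n\n" + transcript.text
            ),
        }
    ],
)

print(analysis.choices[0].message.content)
```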

The biggest lesson learned through these listening sessions, as has been said here, is that students feel disconnected and need more human interaction with the adults. The teachers want to interact more humanly with the students, but they don’t have the time, because they’re so busy with compliance tasks, lesson planning, and everything else. And that goes all the way up: principals don’t have time to get into classrooms because they’re trying to keep up with everything else. So what we discovered is that embracing this innovation, the more you learn it, actually frees up your time to do those more human things, and that’s the key to embracing it.

Nate McClennen: Yeah. Amazing. Amazing transformative leadership, to think of all that work. I know a number of you on the call today are district people, and if you look at all the work Santa Ana has done, it can feel intimidating, but it’s also very inspiring. So I really appreciate the share. If you have questions, please feel free to drop them into the chat. A couple came up that were interesting, so I might just call on you; hopefully you’re able to come off mute and talk for just a quick minute.

Tom Demico, you talked about building a set of principles, and I want to connect that back to Jerry, who was talking a lot about using community. Tom, I wonder if you could come off mute and say how you built those principles. Were they collectively built? Did you all drive them? How did that work? And we can see how that compares to what happened in Santa Ana.

Tom Demico: Yeah. Nice to see everyone. Thanks for the presentation; it’s been great, and our process was very similar to what I’ve heard. We brought together an interdepartmental team of 35 staff: educators, clinicians, and staff from HR, finance, and payroll. We spent four days visioning how we could use AI and coming up with our final guidance, and that was the start; we had to come up with those guiding principles first, which we’ve done. Then, this summer, we created student principles in student-friendly language. We start up next Tuesday, and during the first two days of school every educator in our district will be launching the student guiding principles and doing a lesson on AI literacy, followed by five days of AI literacy during the semester. So that’s how we came up with ours.

Nate McClennen: That’s great. Thanks for sharing. The other question that came up: Diana, you mentioned something about resilience and flexibility and wanting to focus on that, and it goes back to what we’re seeing in portraits of graduates and how we’re working on those. I wonder if you could come off mute and talk a little about your thinking there. What do you mean by that, and how can we think about it in schools and districts?

Diana Laufenberg: I’m somebody who’s been using tech for a long time; I started teaching almost 30 years ago, and the entire time I’ve been at this work, that has been a constant concern of mine. It isn’t how we get kids to a final place, but how we get them resilient and flexible enough to respond to whatever is next. So sometimes when we focus on a specific technology or a specific thing, I struggle to get my head all the way around it, because I know we’re not done. We’re not at a finishing point with AI; we’re not even close to where it’s going. How do we help kids develop, starting at young ages, in a way that isn’t focused on tech necessarily but on those human qualities? I’m fascinated by curiosity, questions, inquiry; that’s a lane I live in a lot with schools. How do we create spaces in the K-1, 2, 3 lane that set kids up to have these powerful moments of resiliency, relearning, rethinking, and reshaping who they are throughout their lives, rather than thinking they’re at some ending point, that school is done, that there’s some finality to this? I just ponder that a lot when we talk about new tech.

Kunal Dalal: What an important question, right? I’ll give you an example of something I do with my four-and-a-half-year-old son. I do a lot of AI stuff with him, and one of the things we do is a daily journal. At the end of every day, I sit with him and ask, “What’s your favorite memory of the day? What’s one of the coolest things you did today?” And I have him really describe it. Before, he would describe it and I would type it out, but now he can just talk to it. So he talks to it and builds out a prompt, answering questions like “Was it cold? Was it hot? What did the sky look like? Where was it?” and so on, and it builds out this cartoon image. Of course, it doesn’t look exactly like what happened, but that’s not the point. The point is that he got to create this thing. Then he might add something: “Oh, there weren’t birds outside; let’s put some birds out there too.” Or, “I wish there was a monster truck in the parking lot, so let’s put a monster truck in the parking lot for this memory, just so we have it.”

And another thing we do: he likes to process his dreams using AI imagery. He’ll come out in the morning and say, “Oh my God, can we do an AI image? I had this dream about something.” So we do it, and again he gets to create this little thing. The point of that is something Wes and I always talk about; it sounds corny sometimes, but I’ll say it anyway: the most important tool in AI is yourself. Wes talks about this a lot. In ed tech, we’ve always had to learn the tool; the better you knew the tool, the better you could be. This is the first tool where the better it knows us, the better and more helpful it can be to us. Of course, we’ve got to manage the ethical boundaries, or whatever our boundaries are, for that. But when we think about how to reach young kids and build a resilience of learning rather than a tool-based fluency, I think this idea that AI centers your own curiosity and your own self is key. That’s not changing. No tool is changing my biology, my biological person. I get to be me, and these tools are just going to be collaborators as I go into the world.

Jerry Almendarez: Diana, I think part of what matters in our experience is the journey. We’re probably further ahead in this conversation than a lot of districts in Orange County and elsewhere, but it’s the journey we went on. It started small. We just started talking about it. We asked the same question you just asked with our early learning teachers and instructors, and it took weeks and sometimes months of reflecting to figure out the answers. But what’s happened is the hockey stick effect: we started out flat, and then the enthusiasm just shot up. So I think the first step, and the best step, because what works for Santa Ana may not work for other districts, is just to engage your people, begin the conversations, answer the questions your team has, and then take it from there.

Wes Kriesel: Yeah, just to chime in on that. There’s a line from a poem, “awake from wonder.” What Kunal said about the dream journal and the daily journal with his son is emblematic of the moments we want to put ourselves in with the people we’re exploring and partnering with. Halfway through the 100 conversations in 100 days, we started using AI in the conversation itself. We would take the transcript, work with it, and prompt-engineer together, because I started saying, “There is no learning about AI unless you’re using AI. Don’t just sit there and talk about it; you’re making zero progress. You’re probably going backwards. All you’re repeating is things you know from the past, unless you’re awake from wonder.” And then you get this wonderful effect, which I think Jerry hinted at about the journey: time dilation. In the moment when you’re discovering something new, time seems to slow down. You become more engaged, it becomes more meaningful, and people open up their hearts and souls to who they really want to be and who we need them to be in the work and in the organization. That’s true of adults and of students.

Diana Laufenberg: We can do a lot of that flexibility work without the tech and still set kids up for success. That’s what I’m also saying: we don’t have to put very young kids in front of this to set them up for success. If that’s your starting place with a lot of teachers, you’re going to have difficulty getting to that acceleration Jerry mentioned. Whereas if you focus on the skill sets you want for the kids, the wonder, the curiosity, the creativity, teachers buy into that faster, and you can move toward that world a little more easily than by focusing on the tool in particular. That’s the work I do with 100-plus teachers a year, very specifically on their classrooms.

Nate McClennen: Fundamentally, we keep coming back to learner agency, learner-centeredness, learner-centered design, all the things we know are good for young people and enhance learning, with AI as a tool. A few quotes, just to wrap it up. Wes, I’ll start with yours: this idea of time dilation. We all know that when we’re in flow, when things are really running well with students in a classroom, or with us learning together as adults, the maximum amount of learning happens. So I appreciate that. Then the idea of centering: Kunal, you talked over and over again about centering us in our own humanity and what’s most important. That resonates. Then this idea of paying attention to the silence, Wes, you mentioned: who is not talking, who is not being heard, who is not represented? And the last one: you are the most important tool in AI. If there is a danger in AI, it’s that we become very complacent. I call it the WALL-E moment. Those of you who have seen the movie WALL-E know it: seven or eight generations down, we are complacent humans, and things just do things to us. We have to be active in our learning here, and we have to be agents of our own learning, and AI can be a great augmented intelligence along the way. So I want to thank Jerry, Kunal, Wes, and all those who contributed. You all are amazing, doing great work, and sharing great resources. Thanks to those of you in the chat. Enjoy the start of the school year, and I hope everything goes well with you and your learners.

Getting Smart Staff

The Getting Smart Staff believes in learning out loud and always being an advocate for things that we are excited about. As a result, we write a lot. Do you have a story we should cover? Email [email protected]

Subscribe to Our Podcast

This podcast highlights developing trends in K-12 education, postsecondary and lifelong learning. Each week, Getting Smart team members interview students, leading authors, experts and practitioners in research, tech, entrepreneurship and leadership to bring listeners innovative and actionable strategies in education leadership.
