5 easy steps for responsibly piloting AI and tech in education

Key Points

  • AI has potential to support student learning, educator development, and more – but a thoughtful approach is critical.

  • User input is important but does not give the full picture. Schools should also evaluate the impact of the technology.

  • As new AI models and tools continue to emerge and evolve in education and beyond, users can help shape the landscape to focus on ethical practice and positive impact through intentional piloting and decision-making.

By: Alice Waldron

For educators, it can feel like a new AI tool is introduced daily, and there is pressure to use new tools just to keep up with the times. What if, instead, we made technology decisions based on ethical considerations and our students’ needs? AI has potential to support student learning, educator development, and more – but a thoughtful approach is critical.

How can schools effectively evaluate the potential impact of new technology without negatively affecting teachers or students? How can we ensure AI is being used to complement and support teachers rather than replace them or add to their workload? How can we confront the drawbacks of AI, such as bias in responses? Here are five steps to help schools pilot AI and other tech tools with purpose and responsibility.

1. Identify the problem

Teachers already have more than enough to do, so asking them to adopt any new technology or idea just for the sake of seeming cutting edge is simply not the best use of their time.

The first step is naming the reason you are considering piloting a new innovation in the first place. What is the problem you are trying to solve and why is that problem important? Once you answer those two questions, you can move on to exploring how a new technology or idea might help address this challenge.

RECOMMENDATION FOR CLASSROOM USE: Review qualitative and quantitative data such as student work, classroom observation trends, student surveys or interviews, and assessment or rubric scores to analyze the strengths and needs in your class. Where are students succeeding and having a positive experience? Where are students struggling more? Is there a support need that technology could meet or a barrier it could remove? 

For example, imagine you are a middle school science teacher and you notice that, although your students can follow the steps in a given lab, they struggle with designing lab questions and procedures themselves. You might explore PhET simulations given their open-ended nature, the ability for students to choose and control different variables, and the associated teacher resources available to you. 

2. Ensure ethical use and alignment with core values

Before piloting any new tech tool, program, or idea, make sure it can be used ethically and that you can avoid unintended negative consequences. For example, when considering any technology tools, schools should always ask what data is collected and how it is used. When it comes to AI specifically, schools should consider how the tool presents information – is the potential for bias and inaccurate information acknowledged? Does it come with guidance on how to interpret AI responses with this in mind? What training or support might you have to provide to users at your school site to avoid unintended consequences? Additionally, evaluate compliance with applicable laws and regulations, such as those protecting student privacy – those exist for a reason!

Any new tool should also align with your school’s or organization’s mission. This helps ensure that whatever you are piloting will serve your educational goals rather than just introducing novelty for novelty’s sake. At Relay Graduate School of Education, we’re committed to creating a diverse and inclusive institution that recruits talented teachers to the profession and supports them throughout their careers. Any AI tool we pilot must align with our mission. With this in mind, consider the purpose of the technology, and review impact studies or data related to the technology, if available.

RECOMMENDATION FOR CLASSROOM USE: First and foremost, check with your school’s administration about guidelines and policies for the use of technology, including AI, with your students. Before you make a decision to test a new technology, good questions to ask and discuss at your school site include: 

  • How will the technology support classroom learning and/or experience? 
  • Can all students, including those with disabilities, access the technology?
  • Does the technology require personally identifiable information from students (e.g., a log-in) and if so, how is this information used and protected? Never use technology that requires personally identifiable information without checking with your administration first. 
  • How much does the technology cost? What devices and/or operating systems are required to use the technology? 
  • Is the technology easy to use and likely to engage students? How much instructional time will you need to roll out the technology? 
  • Is the technology free from bias and/or can it be used as a learning opportunity for students to develop the technological literacy skill of identifying and mitigating bias in technology? 

Returning to the middle school science example above, you might proceed with your plan to use PhET simulations if they are aligned to the standards and goals for your curriculum, given that they require no personally identifiable information, are free to use, and include a menu of simulations with inclusive features for accessibility. PhET simulations do not use AI, but if you were exploring an AI tool, you might consider pairing it with some introductory lessons on AI and ethics for your students: here’s an example from Common Sense Education.

3. Make a plan and set clear goals

In my experience, institutions often evaluate technology based solely on user surveys – did the users like the new technology? This data is important, but does not give the full picture. Schools should also evaluate the impact of the technology.  

Take, for example, one of our tech pilots at Relay last year involving classroom teaching practice in a simulated environment. We identified the following three types of goals:

  • Short term: Reduced faculty grading time and student demonstration of target skills in the simulated environment
  • Medium term: Increased student self-efficacy and student demonstration of target skills in their actual classrooms
  • Long term: Positive impact on culture in students’ actual classrooms as measured by rubrics used during observations.

We then collected data on each of these goals over time. We still surveyed our users, but defining our goals for the impact of technology helped us ensure we were making decisions based on what was helpful for teachers and kids in addition to considering the user experience with the technology.

RECOMMENDATION FOR CLASSROOM USE: Align your goals to the initial problem or need you identified in step 1. With the middle school science example, you could set a goal for a student rubric score increase on lab questions and lab procedure portions of your lab rubric. From there, you could identify labs to run using a PhET simulation. 

4. Monitor progress and make adjustments

Here’s where your user perception data – along with your impact data – comes in. 

TeachFX is an app that records and analyzes classroom interactions and generates AI-powered instructional insights for teachers to reflect on and improve their teaching. We wanted to see how TeachFX could improve the feedback loops for our teacher candidates and save time for our faculty. Students appreciated the instructional feedback and ease of use, but found that long videos were slow to upload. We made an adjustment and reminded students that they could simply record the audio, which uploads quickly. TeachFX was also responsive to the feedback and adjusted how their system responded to video uploads.

We surveyed our students after they used TeachFX for each assignment. After those adjustments were made, the comments about slow uploads disappeared. Responding to data is how you ensure you’re giving the tech the chance to actually work in your context and see if it will have its intended impact.

RECOMMENDATION FOR CLASSROOM USE: Here is where you check in with your students and their learning! Can they use the technology easily? Do they feel like it is supporting their learning and do you notice this impact in their classroom discussions and/or work? 

For our middle school science example, let’s say you modeled how to use a Natural Selection PhET simulation with your students before having them ask an experimental question and design a procedure to answer that question using the simulation. However, while students were working on their own procedures, you noticed they were lacking details on the type of data they would collect to answer their questions. Just like you would with any other lesson, you adjust to your real-time data and, in this case, pause to teach a quick mini-lesson on how to read the population graph in the simulation.

If you are using an AI tool with students in your classroom, monitor the AI output as well as students’ work and experience with the tool. This allows you to keep a pulse on how well students are able to identify potentially inaccurate or biased information, and adjust your instruction as needed.

5. It’s time to make a decision

Unfortunately, not every pilot is destined to succeed. Even AI or other tech tools that seem promising at the start might not be adopted or rolled out for a variety of reasons, including budgetary constraints or data suggesting the potential for impact did not translate into real impact. During every pilot, there’s a point where you have to decide what comes next.

The decision could be a simple “Yes” or “No” in terms of proceeding with a program. You can also simply decide that you need to gather more information before moving forward with full implementation. When we piloted TeachFX during the fall semester, we saw a lot of promising data, including a notable increase in student talk time in the classrooms of Relay students. However, we needed more data in order to assess our long-term impact goals (such as observation rubric scores). As a result, we presented our preliminary findings – which included strong data on short and medium term impact goals, as well as positive user experiences – to the full faculty. Faculty who were already piloting TeachFX shared their experiences and encouraged others to join the pilot based on this data. The positive fall results led us to expand our pilot into the spring so we could learn more about long term impact.

RECOMMENDATION FOR CLASSROOM USE: Review the data you collected relative to your goal(s) for the use of the technology. Did it have the intended impact? How did students feel about it? Did any unintended consequences arise? With our middle school science example, you might decide to keep using PhET simulations if student scores on your target lab rubric rows started to increase with use of the simulations. You might also consider factors like how much instructional time you had to spend introducing each simulation, the clarity and accuracy of the conclusions students were able to draw relative to their experimental questions from the simulation, or the impact of the simulation on student engagement in labs. 
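For readers who track rubric scores in a spreadsheet or with a short script, a review like the one above can be as simple as comparing average scores on the target rubric rows before and after the pilot. The sketch below is only illustrative – the rubric row names and all scores are hypothetical, not data from any real classroom or from the Relay pilots described here.

```python
# A minimal sketch of reviewing pilot data against a goal: compare average
# scores on the target rubric rows ("lab question" and "lab procedure")
# before and after introducing the simulations. All values are hypothetical.

def average(scores):
    """Mean of a list of rubric scores."""
    return sum(scores) / len(scores)

# Hypothetical 1-4 rubric scores for the same class, before and after the pilot
before = {"lab_question": [2, 2, 1, 3, 2], "lab_procedure": [2, 1, 2, 2, 3]}
after = {"lab_question": [3, 3, 2, 4, 3], "lab_procedure": [3, 2, 3, 3, 3]}

for row in before:
    change = average(after[row]) - average(before[row])
    print(f"{row}: average change of {change:+.1f} points")
```

A positive change on the target rows supports continuing the pilot, but as the section notes, you would weigh it alongside instructional-time costs, student engagement, and any unintended consequences before deciding.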

A lot of these ideas aren’t new or revolutionary, but we can and should apply them in this new landscape of rapidly growing AI tools. As new AI models and tools continue to emerge and evolve in education and beyond, users can help shape the landscape to focus on ethical practice and positive impact through intentional piloting and decision-making.

Alice Waldron is the Relay Graduate School of Education’s Dean of Clinical Experience.

Guest Author

Getting Smart loves its varied and ranging staff of guest contributors. From edleaders, educators and students to business leaders, tech experts and researchers we are committed to finding diverse voices that highlight the cutting edge of learning.
