What Should A Teacher Do When Faced with AI Reluctance?
Key Points
- Teachers should consider psychological, social, and ethical reasons driving AI reluctance and create inclusive classroom policies that accommodate diverse student approaches to technology.
- Understanding students’ reasons for rejecting AI tools can aid in fostering meaningful discussions about digital citizenship, workforce readiness, and critical thinking.

There comes a point in every middle-aged person’s life when they make the fateful decision not to master the latest piece of technology or the newest social media platform.
I have a 60-year-old sister-in-law who relies on her children to purchase what she wants from Amazon. One of my 42-year-old colleagues from Korea has never sent a text message; her husband relies on a flip phone devoid of apps. I brought planning for our extended family’s camping trip to a screeching halt when I migrated all the information to a Google Doc. The rebellion was led by nieces and nephews in their 20s.
My friends, family, and colleagues are not Luddites. They, like billions of other humans, reach a point of equilibrium in their relationship with technology. To borrow (and slightly abuse) Gandalf, it’s as if they’re saying, “This too shall not pass.”
As a committed technophile, I struggle to understand this choice. I understand even less the decision of students who fear, delay, and sometimes reject the use of generative AI tools that could assist them in the completion of school work.
Over the last six months I have collected a series of articles that tell the stories of students who refuse to use AI to help them with their assignments. I have used AI-powered search to find research that supports the anecdotal evidence. The research shows, surprisingly, that these naysayers include some of our best and brightest.
I want to understand why smart kids refuse to use a widely available and administratively acceptable educational tool that can save them time and effort while greatly reducing the grunt work of school.
Reasons for AI Reluctance
My exploration was inspired by a pre-publication preview of a study led by Elise Silva, Director of Policy Research at the University of Pittsburgh Institute for Cyber Law, Policy, and Security. The study finds that AI is having “significant interpersonal, emotional effects on learning and trust in the classroom,” with students describing feelings of anxiety, confusion, and distrust.
In mid-August, Scientific Reports (a Nature Portfolio journal) published a study titled “The Role of Personality Traits in Predicting Educational Use of Generative AI in Higher Education.” It finds that the three personality traits most predictive of AI adoption are:
- Openness to Experience: Students high in curiosity, imagination, and willingness to try new things are most likely to embrace gen AI.
- Conscientiousness: Students who are diligent and achievement-oriented use AI more for structured academic tasks.
- Extraversion: Social and energetic students lean toward using gen AI, perhaps due to comfort with interactive tools.
The personality trait that scored as the strongest negative predictor is:
- Neuroticism: Significant negative relationship; anxious or emotionally unstable students were less likely to adopt AI, often perceiving it as stressful or threatening.
These insights led to further reading in the popular press and then a dip into the deeper waters of academic research. I fed my notes into multiple AIs, which synthesized them into three overarching drivers of AI reluctance: 1) psychological and personal barriers; 2) social and familial influences; and 3) philosophical and ethical considerations. Each category breaks down into a subset of typologies:
Psychological and Personal Barriers
For many reluctant students, the issue is rooted in their perception of self-worth and the very definition of learning.
- Fear of Inauthenticity and “Cheating” the Self: The most common reason is a deeply ingrained sense of academic integrity. For these students, using an AI to generate ideas, outlines, or text feels like a form of cheating — not against the school’s rules, but against themselves. They believe that true learning comes from the struggle and the “aha!” moments of personal discovery. Outsourcing part of that process to an algorithm can feel hollow, like they didn’t truly earn their knowledge or the resulting grade.
- Anxiety Over Atrophy of Skills: There’s a legitimate fear that over-reliance on AI could lead to the degradation of fundamental cognitive skills. Students worry that if they use AI to outline an essay, their ability to structure an argument will weaken. If they use it to draft prose, their own writing voice and vocabulary might stagnate. This isn’t just about a single assignment; it’s about their long-term intellectual development.
- Imposter Syndrome: Using AI can trigger or exacerbate feelings of imposter syndrome. A student might look at an AI-polished paragraph and feel that it’s “too good” to be their own work. This can create a disconnect, leading to anxiety that they are a fraud who will eventually be exposed for not having the skills their work seems to represent.
Social and Familial Influences
The environment outside the classroom plays a significant role in shaping a student’s attitude toward technology and education.
- Parental and Familial Values: Many students are guided by a moral compass instilled by their families. Parents who grew up in an era where “cutting corners” was highly discouraged may have passed on a strong work ethic that views tools like AI with suspicion. The dinner-table conversation might frame AI not as a helpful assistant, but as a shortcut for the lazy or unethical. This familial pressure to achieve success through “hard work” alone can be a powerful deterrent.
- Peer Group Perception: Social dynamics can also play a part. In some friend groups, there might be a stigma associated with using AI. It could be seen as a crutch for those who aren’t “smart enough” to do the work on their own. To maintain their social standing as a capable and intelligent person, a student might avoid AI tools altogether.
Philosophical and Ethical Considerations
Beyond the personal and social, some students are grappling with the bigger picture of AI’s role in society.
- Distrust of AI Technology: Not everyone is comfortable with the rapid advancement of AI. Students who are more critical of “Big Tech” may harbor a general distrust of the algorithms and the companies behind them. Concerns about data privacy, algorithmic bias, and the long-term societal impacts of AI can lead them to a principled reluctance to engage with these systems.
- Religious or Moral Objections: For some, the objection can be rooted in religious or deeply held moral beliefs. Certain belief systems emphasize human consciousness, creativity, and the soul as unique and sacred. From this perspective, using a machine to replicate a fundamentally human act like writing or creating art could be seen as a transgression or a devaluation of human dignity. It touches on existential questions about what it means to be a thinking, creating person.
I had expected religion to be one of the primary drivers of AI reluctance, but its effects are nuanced. If you examine the history of the Luddite movement (1811-1817), you will learn that religion played little to no role in the protests, which were rooted primarily in economic hardship and the loss of jobs to industrial automation. Nor could I find any firm indication that religion plays a dominant role in AI reluctance. Both the Southern Baptist Convention and the Catholic Church have stated that AI should never carry out intrinsically human responsibilities, such as moral judgment, stewardship, or protection of the vulnerable. Neither recommends shunning AI.
What Can Teachers Do?
It is obvious that students who use AI tools to assist them on homework assignments or assessments will receive a significant boost over their AI-reluctant peers. This is true even when teachers create classroom policies that designate assignments as AI-free, AI-assisted, or fully AI-generated.
Because politics has barged through the classroom door, I wonder if teachers even have the moral or legal right to broach the topic of AI reluctance, especially since the primary drivers of this stance are philosophical, ethical, familial, social, psychological, and personal.
Additional reading provided suggestions on how to accommodate AI reluctance. Most experts advised being tool-agnostic and outcome-focused. That makes sense until you realize that a student who uses AI-powered tools will generally produce higher-quality work, thus ensuring better outcomes under our current metrics. Quality work is no longer an indicator of mastery or understanding unless, of course, it is paired with a device-free assessment. Think of a math test on which some students chose to use a calculator and others did not.
Many of the resources I reviewed suggested classroom discussions in which you engage students in dialogue about AI-tool use as a digital citizenship issue, a workforce readiness issue, a life skills issue, or a critical thinking issue. Yes, but …
These discussion topics or classroom policy tactics do nothing to remove the original source of AI reluctance. And why should they? Do you really want to challenge a student’s personal, cultural, or religious beliefs? That’s a debate best left to the courts — or perhaps the afterlife.
Perhaps this is only a tempest in a teacup. If 75, 80, or even 90 percent of the students in your classroom are using AI tools in a permissible and ethical manner to complete some of their assignments, is it really a problem you need to solve? I have long struggled with students who choose not to learn, but that is not the case with AI reluctance. These are students who are choosing not to learn in the most efficient way possible. I have a sister-in-law who notes and tracks her appointments and meetings in a paper day planner. Her choice.
As I have mentioned in prior blogs, AI implementation is so new and changing so rapidly that we are building the airplane while we are flying it and the construction crews are still paving the runway while they pipe electricity into the control tower. This means that the best advice I give you today is likely to be worthless tomorrow.
I make a living mansplaining AI usage to teachers. This time, I have no answers, so I’m all ears. Tell me what works for you.
