Lately, I’ve been using AI more often. I find that the paid version of ChatGPT can be a more efficient search tool than Google, or at the very least, one that shows different results. (Speaking of Google, its AI summaries of search results, particularly for queries written in question form, have become impossible to ignore.)
AI makes its presence subtly felt in many other ways throughout my workday: from the AI summaries generated by the transcription software I use for interviews as a journalist, to the various AI tools offering to rewrite my emails, to the influence, both positive and negative, I seem to detect in students and others I communicate with online.
Much of this AI use is great and promises to do wonderful things, such as offering personalized learning to more students and increasing accessibility. Yet as AI becomes more subtly and not-so-subtly integrated into my daily work as a writer and educator, I recently found myself thinking about establishing boundaries with my AI use and honing my sense of when using it is, or is not, an efficient use of time. I also spoke with two experts on this topic for their perspectives.
AI Shouldn’t Be Used to Replace Tasks That Force Us to Think
It’s easy to say we should never offload our thinking to AI, but what that looks like in practice varies.
For example, I recently delivered a talk for the Bethel Connecticut Historical Society on how vodka distilling in the U.S. started in the town. It’s a subject I have written about in depth and a story I shared with the History Channel show Food That Built America in 2023, but I had never presented it in this format before. I’ve written before about how I hate putting together slideshows and struggle with them in general. The talk also came at a particularly busy time in the semester for me.
I thought AI would be the solution to this problem. However, several attempts at creating a helpful slideshow with various free AI tools failed: they produced weak outlines, full of hallucinations, that were also visually unappealing.
So reluctantly, I sat down and created my own slideshow, albeit with the help of a few AI-generated images that served the role normally occupied by stock photography. Two and a half hours later, I was glad I did this, not because my presentation was particularly visually appealing — it’s not — but because creating it forced me to sit down and think about what I was going to talk about. It reminded me of details about this piece of history I had forgotten.
In other words, I learned when I was creating the slideshow. Had my initial prompts to AI been successful, I’m convinced I would not have given as effective a lecture. Now I’m more on guard for other instances in which AI might be the easy solution but not the best.
AI Shouldn’t Replace Human Interaction
It may sound obvious that AI shouldn’t take the place of relationship building in education, but as the efficacy of personalized AI tutors increases, more of us will need to remind ourselves of this.
“Every child needs caring adults who are central to their life,” says Jeremy Roschelle, co-executive director of Learning Sciences Research at Digital Promise. “Don’t use AI to replace the role of a caring adult. Having caring adults in your life is an important part of growing up.”
Educators should also remind students of the importance of peer relationships rather than interactions with AI characters, which are becoming increasingly common, particularly among young people.
“Another really important part of growing up is socialization with other people,” Roschelle says. “And AI-based things can be so nice to be with. They never have a grouchy moment or a down day. They never snap back at you. But as kids, as learners, you need to learn how to deal with people who aren’t always so nice.”
Navigating What Productive AI Use Looks Like for Each of Us
As with many other areas of education, there are few one-size-fits-all answers to when not to use AI. Adeel Khan, founder and CEO of MagicSchool AI, a tech tool that helps teachers support students, believes it is important that we all learn when AI is helpful and when it isn’t.
“Sometimes I find myself using it, and I’m like, ‘I offloaded too much of my thinking to AI,’ and this isn’t really what I think, but it was good enough for me to accept because I had to get something done,” Khan says. “Then I look back at it, and I’m like, ‘I wish I didn’t do that, and I’m not going to do that again.’”
Khan adds that, through trial and error, he has found his own sweet spot for when to use AI and when not to, but that others need to discover for themselves when AI is productive and when it isn’t.
“Everyone’s answer to that question is going to be a little different based on who they are, the task at hand, their purpose, and their own skills,” he says. “It’s really important that people practice using it and make mistakes so they can calibrate how they can use AI really productively, and also understand how they should limit their usage of it.”
As educators, we should be asking these questions about our own AI use and encouraging our students to ask these types of questions as well.