It’s no secret that K–12 students have embraced and are actively using artificial intelligence technologies. From ChatGPT to Grammarly, students are using AI to help them write essays, solve math problems and complete homework assignments. In fact, it’s often more common to see students using AI tools in the classroom than their teachers.
With no universal guidelines for using AI in K–12 schools, and with the technology evolving at breakneck speed, educators have largely been left on their own to determine whether, and how, they should use AI in their classrooms. Teachers, however, can find it difficult to carve out the time and the means to educate themselves on new technologies. While some districts have started to prepare teachers to use AI, as Glendale Unified did before the 2023-2024 school year began, other districts have banned AI tools.
New York City Schools, for example, made headlines when it banned ChatGPT shortly after the tool launched. The district has since reversed course entirely, rolling back the ban and launching an Artificial Intelligence Policy Lab. Yet even as some districts embrace the technology, as of January 2024 only three states, California, Oregon and West Virginia, have provided guidance to schools on using AI. Teachers are largely the ones making decisions about AI.
At USC Rossier, faculty aim not only to equip students to use, think about and discuss AI in their classrooms, but also to prepare future teachers to meet new technologies, whether AI or something else, with curious and critical minds.
There have been many groundbreaking technological advancements that threatened to upend education, and there will be more to come. Take the advent of the calculator, which Professor of Clinical Education and Engineering Anthony Maddox says was met with fear by some educators who thought students would never learn to solve equations on their own. Despite those anxieties, calculators didn’t end the need for math class, just as ChatGPT won’t end the need for students to learn how to express their thoughts in written form.
“AI is a technological tool,” says Professor of Clinical Education Corinne Hyde. “Like most other technological tools, it can be used for good, evil or anything in between, and so it’s really on us as educators to figure out when, how, and how much we want AI to take a role in education.”
PREPARING FUTURE TEACHERS
All Master of Arts in Teaching students at USC Rossier are required to take a course called “Blended Learning Experiences for Students in Urban Schools,” in which they consider how to go about designing, implementing and evaluating technology-rich learning environments for K–12 students. Professors Corinne Hyde and Anthony Maddox have both taught the course for a number of years.
Hyde has always encouraged discussions about new technologies in her class, and when ChatGPT emerged in late 2022, she began to think about how to integrate AI more deliberately. The project began with a lot of questions. First off, should students be allowed to use AI for their assignments? If so, what would the parameters be?
“AI can be used for good, evil or anything in between, and so it’s really on us as educators to figure out when, how, and how much we want AI to take a role in education.” —Corinne Hyde, professor of clinical education
To start, Hyde, drawing on resources from the Sentient Syllabus Project, crafted a statement to include in the syllabus about how students can and cannot use artificial intelligence to complete assignments. Discussions with other faculty convinced her to encourage students to use it, “as long as they attributed what was AI-created to the AI, and made it clear how they got that output from the AI.”
Hyde models how to use several AI tools in class: ChatGPT, Teachology and Elicit. The majority of students who use AI to help with assignments are using ChatGPT, and Hyde is careful to warn students about its shortcomings and to demonstrate them: the inaccuracies it produces, its inability to properly cite sources and, in some cases, its outright fabrication of sources.
While Hyde feels certain that more students will begin to use AI, so far only around 15-20% of her students have disclosed that they’ve used it to complete assignments. Hyde theorizes that students are wary of using ChatGPT because the fact-checking it requires largely erases its time-saving appeal.
Permitting students to use AI when completing assignments may strike fear in some educators who worry students will use it to cheat. Furthermore, “overreliance on generative AI in classrooms and for assignments can at times take away from the students’ authenticity, creativity, revision and autonomy, qualities that make up the core aspects of the human experience and the true power of education,” says professor Nooshan Ashtari. Ashtari, who teaches in USC Rossier’s Master of Arts in Teaching - Teaching English to Speakers of Other Languages (MAT-TESOL) program, urges teachers to consider the three states of mind that renowned psycholinguist Frank Smith identified: learning, boredom and confusion. When students are bored or confused, turning to ChatGPT to complete an assignment will seem very appealing.
The goal of the educator, Ashtari believes, is to create class activities, discussions, and assignments where students are truly engaged and learning. “Perhaps some of the more fundamental questions that we need to ask ourselves as educators, lifelong learners and researchers should be to re-examine which states of mind our classes encourage in our students and how we can navigate them more towards the authentic goal of education: true, unadulterated, limitless learning,” Ashtari says.
Professor of Clinical Education Jenifer Crawford, who also teaches in USC Rossier’s MAT-TESOL program, has integrated AI into her courses as well and chose to include a statement in her syllabus similar to Hyde’s. AI holds particular promise in fields like teaching English to speakers of other languages, so it’s important that students are familiar with it.
Some new AI tools are especially helpful to MAT-TESOL students, many of whom are not native English speakers, Crawford says. Tools like ChatPDF and Litmaps allow students to engage with assigned readings in helpful and meaningful ways ahead of class, such as defining terminology or contextualizing citations, so class time can be reserved for more in-depth discussion of the article. Also notable, Crawford says, are generative AI tools like Elicit, which can quickly identify and synthesize research for in-service teachers in ways they can easily understand, giving them a path to integrate research-backed best practices into their teaching.
One of Crawford’s focuses is “prompt engineering” for tools like ChatGPT. Crafting effective prompts has become an essential skill for anyone who hopes to use these types of technologies well, and it’s especially valuable for education students and teachers in the field who are creating AI-generated lesson plans. Giving the AI explicit instructions based on a particular class’s makeup, instructional approach and learning goals produces much more useful lesson plans for teachers.
Crawford integrates prompt engineering into her “Introduction to Curriculum Instruction” course. She teaches students “how to use prompt engineering to generate ideas for lesson plans and how to write prompts to get really thoughtful, meaningful feedback that incorporates professional standards and research-based and theory-based instructional frameworks, or frameworks for second language acquisition.”
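The class-specific prompting Crawford describes can be sketched as a simple template. Everything below is illustrative rather than her actual materials: the function name, the fields and the sample class details are assumptions chosen to show how explicit context shapes the request sent to a chat-based AI tool.

```python
# A minimal sketch of "prompt engineering" for lesson planning.
# The fields and wording here are illustrative assumptions, not
# an actual USC Rossier template.

def build_lesson_plan_prompt(grade, subject, class_profile,
                             learning_goal, framework):
    """Assemble an explicit, context-rich prompt for a chat-based AI tool."""
    return (
        f"You are an experienced {grade} {subject} teacher.\n"
        f"Class profile: {class_profile}\n"
        f"Learning goal: {learning_goal}\n"
        f"Ground the plan in this instructional framework: {framework}\n"
        "Produce a 45-minute lesson plan with a warm-up, guided practice, "
        "independent practice and a formative assessment."
    )

# Example usage with hypothetical class details.
prompt = build_lesson_plan_prompt(
    grade="7th-grade",
    subject="English language development",
    class_profile="28 students, 12 of them multilingual learners "
                  "at varied proficiency levels",
    learning_goal="students can summarize a short informational text "
                  "in their own words",
    framework="research-based scaffolds for second language acquisition",
)
print(prompt)
```

The point of the sketch is that the class profile, goal and framework are stated explicitly rather than left for the AI to assume, which is what turns a generic lesson plan into one a teacher can actually adapt.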
“Perhaps some of the more fundamental questions that we need to ask ourselves as educators, lifelong learners and researchers should be to re-examine which states of mind our classes encourage in our students and how we can navigate them more towards the authentic goal of education: true, unadulterated, limitless learning.” —Nooshan Ashtari, EdTech researcher and professor
Hyde is also very interested in AI’s potential to take some of the workload off of teachers. She’s explored what types of AI tools her colleagues and her students who are in teaching placements have come across, and includes discussions about those tools in her classes. There are numerous AI programs teachers can use to create lesson plans, scaffolding tools and assessments. Producing these types of materials could take teachers hours, but now AI can spit them out in mere seconds.
But is anything lost when teachers ask ChatGPT to create a lesson plan rather than creating one from scratch? For Hyde, the answer is a definitive no. “Teachers have been using pre-created prepackaged curriculum for decades that they didn’t create,” Hyde says. “They take it, they modify it, they put it into practice. If AI helps to take part of the burden off of that and allows for more customizability for teachers, that’s incredible.”
“A big part of teaching,” Hyde says, “is knowing how to use the tools at your disposal in order to help someone else understand something that they don’t understand. It’s lighting that spark and helping them to become learners. Does it matter if I teach that with a lesson plan that I modified from a textbook? Does it matter if I do that with a lesson plan that AI wrote for me? I just don’t think that the origin of the things that we use to teach matters nearly as much as whether or not it’s useful to us and our students.”
Crawford agrees. When a teacher uses ChatGPT responsibly, “instead of spending all their time making the lesson plan, they’re reviewing the lesson plan, thinking about how it could be better, and using what they know from culturally responsive instruction observation protocols.”
Maddox, on the other hand, says students can “use AI to do the entire Blended Learning course if they like. And then,” Maddox says, “the question for that is, ‘Now, what?’”
He stresses that students will do themselves a huge disservice if they disconnect their own minds from the material and instead let ChatGPT engage with the important questions about how to use technology in the classroom, questions they will certainly face once they are leading classrooms of their own. Maddox would rather students trial methods and explore possibilities in his class, because it gives them the incredible advantage of a dry run before they have 30 sets of probing eyes on them.
“What I’m hoping, honestly, is that at the end of the course there are actually few answers, because there shouldn’t be answers,” Maddox says. “How students deal with situations where there are no answers is actually what we’re preparing future teachers for. We want those kinds of agile teachers and thinkers in our classrooms with our children and our youth.”
TO THINK, CRITICALLY
One of the most important things for graduate students to take away from the classroom when it comes to AI is the understanding that artificial intelligence, at its current stage, cannot engage in critical thinking. Crawford explains, “AI doesn’t do critical thinking well insofar as we define critical thinking by the Delphi Report, which is the consensus now, of what are the cognitive skills of critical thinking.” In short, that seminal report identified the skills necessary for critical thinking: interpretation, analysis, evaluation, inference, explanation and self-regulation.
When a user inputs a prompt into ChatGPT, “it’s drawing on a data set to predict the language that it thinks we want to hear it say,” Hyde explains. “That’s not the same as it reasoning about what we want it to do, and then giving us an answer. It’s not reasoning, it is using language to speak to us in the way that it thinks we want it to speak to us.”
And because of this, ChatGPT falls well short of critical thinking. For one, “it doesn’t really do a great job of connecting to issues of equity and identifying its own implicit algorithm biases,” Crawford says. ChatGPT does not have a “critical consciousness,” something the late educator Paulo Freire described as “reading the world”: a person’s ability to engage with and question their political, social and historical situations. This ability, Freire believed, was critical to the creation of a successful democracy.
While ChatGPT isn’t capable of this level of intellect, it can help teachers with labor-intensive tasks like outlining lesson plans and brainstorming activities. It will get teachers “halfway there,” Crawford says, and teachers can then spend the time they gain editing the framework and adding layers of thought and analysis, creating lesson plans that equitably engage students and bolster their own critical thinking skills.
“AI doesn’t do critical thinking well insofar as we define critical thinking by the Delphi Report. … It doesn’t really do a great job of connecting to issues of equity and identifying its own implicit algorithm biases.” —Jenifer Crawford, professor of clinical education
LEVERAGING TECHNOLOGY
Maddox hopes that when students complete the Blended Learning course, “they’ll broaden what the definition of technology is to not only accommodate the current state, which is evolving, but future states of technology that will be accelerating at the same time.”
“Technology,” Maddox says, “is leveraging phenomena for useful purposes,” as shared with him by USC Viterbi Dean Yannis Yortsos.
AI is neutral, Hyde believes. “The technology is not going to improve anything for you,” she says. “It’s your use of the technology as an educator that can improve things.”
“How students deal with situations where there are no answers is actually what we’re preparing future teachers for. We want those kinds of agile teachers and thinkers in our classrooms with our children and youth.” —Anthony Maddox, professor of clinical education and engineering
Hyde hopes her students, soon-to-be teachers, won’t be afraid of new technologies like AI, but will instead “figure out how they can be useful.” She hopes technological advances, like the one we are experiencing with AI, will be used to “free teachers up to do things of importance, to do things that are innovative. Far too many teachers are just stuck in a cycle of having to do a list of pre-prescribed things because someone else in an office somewhere decided that that was what they needed to do, and it’s not actually what’s useful for the students in front of them.”
When students leave her class, Hyde's greatest goal is that “they’re questioning every assumption about what education should look like. What’s the purpose of it? How do we do it? What are the things that people see as unchangeable, that are actually just things that we’re really used to doing?”
If students leave with these questions front of mind, Hyde feels that she will have done her job.