
“We Can’t Pretend It Doesn’t Exist”: Lasallian Education in a World of AI

When it comes to deciding what the integration of AI into academics looks like, La Salle has aimed to balance its mission statement and core values with the skills students will need once they leave.
Seychelle Marks-Bienen

On November 30, 2022, the world was thrust into a new direction forever.

With the release of OpenAI’s ChatGPT, the world of education was forced to confront new and increasingly capable AI tools in the hands of its students. Since then, teachers, staff, and administration at La Salle have been grappling with the use of AI and how to draw the line between utilizing it as a tool and preserving morality and individuality.

“Our mission statement says we offer a human and Christian education,” Principal Ms. Alanna O’Brien said. “So the question is, what is that? What does that mean in a world with AI? How do you offer human education when AI is so prevalent in the world and growing pretty rapidly?” 

It means writing new school-wide policies, she explained. It means a new class for freshmen next year, focusing partly on technology. And it means new conversations with teachers, students, and parents, balancing the ethical implications of AI in academics with the reality of the professional world students will graduate into.

But the transition from an environment without the powerful AI tools that now exist to one where they are readily available to students hasn’t been seamless, department heads observed. 

During the August inservice week, author and George Fox University professor Dr. John Spencer was brought in for faculty professional development and to speak about how La Salle can incorporate AI into education while staying true to its mission statement. Highlighting the skills that can’t be replicated by AI — such as empathy, compassion, and curiosity — Dr. Spencer emphasized that when teachers turn away from AI, they aren’t teaching their students how to utilize it ethically and responsibly.

“We’re aware of the reality that AI is here,” Ms. O’Brien said. “We can try our best to not have it impact how we operate, but that’s not helping students because they’re leaving us to go into a world that has AI.” 

Since then, teachers have been adapting their classes, assignments, and policies to reflect the growing usage of AI, balancing giving students opportunities to use it effectively with ensuring they understand the responsibilities associated with it. Each department, level, and teacher decides how much AI they will allow for assignments, but their individual policies all align with one written at the beginning of the year by Ms. O’Brien and Vice Principal of Academics Ms. Kathleen Coughran in collaboration with the Academic Council.

Found in every class syllabus, it reads: 

The use of artificial intelligence (AI) tools to complete classroom assignments and assessments must come by direction and permission of the teacher. If a student is found to have used AI-generated content and submitted it as their own without permission of the teacher, the student may fail the assignment and be referred to the Vice Principal of Academics. Academic integrity is of critical importance at La Salle Prep. Students who cannot uphold our standards of academic integrity may face serious academic consequences.

Beyond that, though, there is a lot of gray area.

In the English Department, for example, although some teachers don’t allow ChatGPT at all for assignments, students have been utilizing AI since before OpenAI released its chatbot. Software such as autocorrect, spellcheck, and Grammarly supports them all the time with writing, and, English Department Chair and English teacher Mr. Paul Dreisbach said, “I don’t think a single teacher in the department has any issue with any of those things.”

The more complicated issue is whether AI tools such as ChatGPT can be used.

On the one hand, it can be “tremendously” useful for students who are stuck, Mr. Dreisbach explained, providing a jumpstart for them to begin their own thinking, a tool to see different perspectives, and an aid in brainstorming prewrites for essay development. On the other hand, it can circumvent crucial problem-solving skills, prevent growth, and become a stand-in — rather than a stepping stone — for their own thinking.

“Teachers realize that this is a tool that is here and is here to stay, and that you and your generation will be using [it] in your work life and in your daily life,” Mr. Dreisbach said. “It’s not a matter of ‘how do we avoid it?’ or ‘how do we get students not to use it?’ It’s a matter of ‘how do we make use of this effectively and helpfully?’” 

That’s his philosophy: teachers can’t bury their heads in the sand and ignore AI.

In addressing academic integrity and how both policies and usage will shift moving forward, he — and other department heads — underscored that AI is a tool, one that presents both challenges and opportunities for students and one they will undoubtedly use after graduating from La Salle.

Social Studies Department Chair and social studies teacher Mr. Alex Lanaghan compared it to the release of Google.

After Google came out, he said, teachers suddenly had their students looking everything up online, a change as startling then as the release of AI is now. And, just as with ChatGPT or other tools students are given access to, like the La Salle library databases, teachers had to instruct students on how to actually use it.

“I’m sure people back then thought Google was going to just destroy learning, and now look at us, right?” Mr. Lanaghan said. “We still learn.” 

And 21st century learning, he said, is more than what Google or AI tools can do. It’s not about memorizing or recalling dates and names; it’s analysis, evaluation, and critical thinking. Even when AI is providing the information, students need to be the ones developing and expressing their voice, not simply copying and pasting what AI tools can come up with for them.

“We want to hear your opinions, your emotions,” Mr. Lanaghan said. “We want to hear your experiences, and AI can’t do that.” 

In the Social Studies Department, Mr. Lanaghan said, AI is viewed as an “extension of Google.” For some questions, Google can’t provide specific or nuanced answers, so both teachers and students have turned to AI when conventional search engines haven’t been adequate. Teachers also utilize it to create rubrics and instructions, as well as to build their own knowledge about a topic they will discuss with students.

But, as effective as AI tools can be for gathering information, Mr. Lanaghan pointed out that — same as anything provided by Google — they need to be fact-checked. 

According to him, the information AI provides can be incredibly beneficial for jumpstarting students when they are stuck on projects or assignments. However, it has fabricated sources before, so he advises students to use their “common sense” if it ever outputs something nonsensical or hyperbolic, cross-referencing what it tells them with additional, reliable sources.

“Trust, but verify,” Mr. Lanaghan said. 

His overall goal: that AI can be one of the steps students take in completing a project or assignment, either in brainstorming, research, or in expanding upon their ideas, leaving the finished work a mosaic of everything students have learned in their own words. 

“AI is nothing without the person behind it,” Mr. Lanaghan said. “It’s just like any other internet resource, tool that we have. It has lots of great benefits, but you can’t just let it drive your car by itself — you still want to have a human behind that wheel.” 

When students are pressed for time, he said, AI can often become a shortcut rather than a support. That’s true not only for social studies and English but also for world languages, World Language Department Chair and Spanish teacher Ms. Lisa Moran said. 

As in English, AI tools are a familiar challenge for the World Language Department, she explained. Since Google released Google Translate in 2006, language teachers have needed “creativity” to restructure their assignments, Ms. Moran said, ensuring that the majority of projects are done in class and that homework assignments require enough personal thought that students won’t find success by simply plugging them into Google Translate.

The main difference between Google Translate and ChatGPT is that with chatbots, students can translate much larger texts with more convenience. As the tools have progressed over time, the temptation to use them has grown along with their accuracy and reliability, she observed. 

Overall, Ms. Moran said that AI is “more [of a] struggle than a use” in the World Language Department.

The main obstacle for language teachers is helping students realize the value of spending ten minutes doing an assignment. Aside from an AI feature on Quizlet that creates practice quizzes for students based on a pre-set vocabulary list, she explained, it’s been difficult to integrate AI into the World Language Department as so much of learning a new language is practice.

In the first levels of Spanish, French, and German, the majority of the class is skill building and learning small bits of information, which are built on as the year progresses. Even if students only use AI for one assignment, by doing that they aren’t practicing the small skills the teacher will be expanding on in the next class period.

Ms. Moran said students often turn to AI when they’re overwhelmed with homework or because of their drive to get a good grade. 

“They’re afraid that if they can’t communicate their information the way that they would say it in English, that that’s not going to be good enough,” Ms. Moran said. “And they equate their inability to do that with not being good at Spanish.”

However, teachers don’t expect students to communicate that way, she said. 

“We’re only looking for what they’re capable of doing,” Ms. Moran said. “We really have tried to teach our students and tried to inform them [of] that.” 

When students use AI and it gives them sentences in tenses or with vocabulary they can’t explain or recreate, Ms. Moran said, it shows they’re using AI just because it’s easier and looks better than what they can do on their own. But what she encourages them to do is work within the limitations and parameters they have as non-fluent speakers, and be confident utilizing the knowledge they already possess.

That’s when teachers can help them. 

In math, writing, and learning a new language, what’s key for students’ growth is that teachers know where they are.

“I want them to show work and make mistakes and ask for help much more than I want them to turn in something that’s perfectly done that they learned nothing from,” Math Department Chair and Math teacher Ms. Kristin Boyle said. “I want my students to know it’s okay to make a mistake.”

Ms. Boyle wants students to attempt homework and do what they can, even if they make mistakes. If they come to class with questions, she can help them; if they come to class with homework that’s completely done because they used AI, it’s harder for her to help them, as they don’t know what they’re doing but are presenting the work as if they do.

For the Math Department, the main AI tool students are using is Photomath. 

When students use it to check their answers, reverse engineer work, or ensure they are on the right track, Ms. Boyle said that’s an acceptable use of AI. But when it becomes a replacement for their work — similarly to when students copy off a friend’s homework — it’s not.

“The goal is learning. The goal is understanding. The goal is mastery of concepts,” Ms. Boyle said. “If you’re utilizing something like Photomath just to get a quick answer, then you’re just not going to learn.” 

Despite how new AI is, it’s already having ripple effects on students’ learning, some of the department heads said.

Following the school policy, when students plagiarize AI’s work or paraphrase it, and then submit it as their own, that’s considered unacceptable use of AI. For teachers, Mr. Dreisbach said, it’s upsetting when students utilize AI without permission, and it’s also noticeable — teachers can tell when work is not indicative of a student’s voice.

“To me, it’s the same thing that has been happening for years, where students will try to pass off something as their work when it’s not their own work,” Ms. Moran said, which Ms. Boyle echoed by pointing out that the ways students are using AI dishonestly align with how they cheated on homework and tests prior to its release. “They’re not learning it and they’re not trying to understand it. They’re just taking [the AI’s work] because it looked good, because it was what they wanted.”

For instances of academic dishonesty in the World Language Department, no matter how small, Ms. Moran said the parents are always informed and involved, along with Ms. Coughran.

“I think it happens a lot more than any of us want to acknowledge,” Mr. Dreisbach said. 

One of Mr. Dreisbach’s students utilized AI as a fill-in for their own thinking, typing an assignment question into AI and then paraphrasing what it gave them. When he spoke with the student’s parents, one of them said they often do something similar in their work to streamline everyday, time-consuming processes.

“What I thought was really revealing was that he said in the end, ‘I know how to do this already, and I could do it on my own. The AI just helps me do it more quicker,’” Mr. Dreisbach said.  “Therein sort of lies the difference, right?”

In Mr. Dreisbach’s opinion, that parent was using AI ethically, as they have credentials and a professional legacy to show they are capable of the work AI is doing.

But students often haven’t shown teachers that they can execute the skill AI is performing for them.

“At this age and with this timeframe that we have working together, we don’t often know what the students can do,” Mr. Dreisbach said. “And so using the AI seems like a replacement rather than an enhancement.” 

Instances of academic dishonesty with AI are addressed individually, Mr. Dreisbach, Ms. Boyle, and Ms. Moran said. There are discussions beforehand about what is appropriate AI usage for homework, assignments, or essays, and when students fail to adhere to those guidelines, teachers meet with them one-on-one. 

The aim of these conversations, Mr. Dreisbach explained, is to talk to students about what the goal of the assignment was and how the use of AI compromised that goal. From there, teachers may work with students to give them an opportunity to demonstrate that they can do the work AI completed for them.

“It’s always difficult talking to students about that because there’s always the problem or the risk of harming a relationship that’s been built around trust,” Mr. Dreisbach said. “It’s important to have honest, transparent, blunt, direct conversations that are not meant to be critical, but are meant to be productive and proactive and helpful.” 

Students also lose points on assignments, Ms. Moran said, for using AI without permission. If it’s for a test or assessment, they may miss credit on a section or earn a zero for the entire assignment. 

Moving forward, these department heads and Ms. O’Brien said that collaboration between students and staff is crucial in integrating AI effectively into academics at La Salle. 

“I think some teachers are also feeling a loss around how do you teach students to think critically if there’s something out there that shortcuts that process? And so how do we also encourage critical thinking and not turn that process over to AI?” Ms. O’Brien said. “We want our students to graduate from here as thinkers and we want people to graduate having curiosity and creativity.” 

In order to make that happen, she said, education needs to be student-centered.

Ms. O’Brien believes classes should revolve around research questions and student inquiry rather than “transactional” learning where students simply memorize knowledge. That way, teachers can coach students on how to use AI appropriately and effectively from a place of innovation and curiosity. 

While that rests on school policy and teachers evolving their instruction, it also relies on student ownership, she said. Students have to be honest about their goals for after high school and what it will take to achieve them: if they want to go into a field like medicine, for example, she said that they’ll need to deeply understand concepts and be able to talk about them in the moment. 

That understanding rests on their education, Ms. O’Brien said, and the work they put into it. 

“The more students can sit in the driver’s seat of their own education and take responsibility for what they want to do and where they want to become an expert, the better we can integrate something like AI to support them,” Ms. O’Brien said. 

In addition to that, teachers are going to have to learn from their students, Mr. Dreisbach said.

As is often the case, Ms. Moran explained, teachers may be steps behind their students when it comes to knowledge about technology. She said it will be important to ask students for ways they think AI can be utilized positively, as teachers are sometimes stuck in their “trenches” and too busy to look into new ways of doing things.

According to Mr. Dreisbach, this is a situation where humility is important. 

Teachers have to be willing to say they don’t know something in order to gain insight from someone who does, he said. In light of this, Mr. Dreisbach said, further training and dedicated time to learn with someone well-versed in AI would be extremely valuable for them.

“We are all I think very aware that this is something that is here, it’s not going anywhere, and we can’t just say ‘nope, can’t use it, don’t use it,’” Mr. Dreisbach said. “We all need to figure out how to use this, not only for ourselves, but also for you all as well.”

Ms. O’Brien knows that this is an ongoing challenge that the staff and students will have to overcome and grow from together.

As an avid reader and writer and a former English teacher, she has her own qualms about the rise of AI as a dominant force in the educational world.

“There’s times where I just want it to go away, to be honest,” she said. “But I also know that if I just stop there and live in that attitude, then the world is going to move on without me.” 
