Intentional Teaching, a show about teaching in higher education
Intentional Teaching is a podcast aimed at educators to help them develop foundational teaching skills and explore new ideas in teaching. Hosted by educator and author Derek Bruff, the podcast features interviews with educators throughout higher ed. (Intentional Teaching is sponsored by UPCEA, the online and professional education association.)
Student-Designed AI Chatbots with Windy Frank and Sarah Gibson
When I heard my friend Windy Frank, an adjunct faculty at Lipscomb University here in Nashville, talk about an assignment of hers in which students designed custom AI chatbots, I was very interested. Windy teaches in the College of Bible at Lipscomb, and she asked her students to create AI chatbots based on figures in the Old Testament. Students then engaged their chatbots in conversation, asking the prophet Jonah about his biggest failure or Daniel to make up some names for the lions he famously encountered.
Today on the podcast, I talk with Windy Frank and with Sarah Gibson, faculty fellow for AI and professor of communication at Lipscomb about Windy’s assignment in particular and about Lipscomb’s approach to generative AI more generally. They share what led to the Lipscomb faculty pushing its administration to provide AI tools for the entire campus, and the kinds of objections that students have had to AI-integrated assignments—including religious objections from students at this faith-based university.
Episode Resources
Sarah Gibson’s faculty page, https://lipscomb.edu/directory/gibson-sarah
Sarah Gibson’s website, https://professorgibson.com/
Windy Frank on the Lipscomb University College of Bible and Ministry podcast
Podcast Links:
Intentional Teaching is sponsored by UPCEA, the online and professional education association.
Subscribe to the Intentional Teaching newsletter: https://derekbruff.ck.page/subscribe
Subscribe to Intentional Teaching bonus episodes: https://www.buzzsprout.com/2069949/supporters/new
Support Intentional Teaching on Patreon: https://www.patreon.com/intentionalteaching
Find me on LinkedIn and Bluesky.
See my website for my "Agile Learning" blog and information about having me speak at your campus or conference.
Welcome to Intentional Teaching, a podcast aimed at educators to help them develop foundational teaching skills and explore new ideas in teaching. I'm your host, Derek Bruff. I hope this podcast helps you be more intentional in how you teach and in how you develop as a teacher over time.
Derek Bruff:For the past year or so, I've been facilitating faculty learning communities exploring the use of custom AI chatbots in higher education. The faculty I've worked with have designed AI course assistants and assignment coaches, tutor bots, and more. The off-the-shelf AI chatbots don't always engage students in helpful ways, and it's been interesting to see what instructor-designed chatbots can do to support learning. In all of those cases, however, the custom AI chatbots have been designed by faculty. So when I heard my friend Windy Frank, an adjunct faculty member at Lipscomb University here in Nashville, talk about an assignment of hers in which her students designed custom AI chatbots, I was very interested. Windy teaches in the College of Bible at Lipscomb, and she asked her students to create AI chatbots based on figures in the Old Testament. Students then engaged their chatbots in conversation, asking, say, the prophet Jonah about his biggest failure, or Daniel to make up some names for the lions he famously encountered.
Derek Bruff:Today on the podcast, I talk with Windy Frank and with Sarah Gibson, Faculty Fellow for AI and professor of communication at Lipscomb University, about Windy's assignment in particular and about Lipscomb's approach to generative AI more broadly. They share what led to the Lipscomb faculty pushing its administration to provide AI tools for the entire campus, and the kinds of objections that students have had to AI-integrated assignments, including religious objections from students at this faith-based university. Full disclosure here: my parents went to Lipscomb University and, in fact, met there. I've had several family members work at Lipscomb, including my wife, and I have a few friends on the faculty there, including today's guest, Windy Frank. So my relationship with this university is a lot different than it is with most universities.
Derek Bruff:With that said, let's go to my conversation about student-designed AI chatbots with Windy Frank and Sarah Gibson.
Derek Bruff:Windy, Sarah, thanks for coming on Intentional Teaching today. I'm very excited to talk to you and uh learn more about your teaching contexts and the ways that you've been teaching with AI. Thanks for being here.
Sarah Gibson:Thanks for having us.
Derek Bruff:I'm gonna start with my usual opening question, um, which is this: Can each of you tell us about a time when you realized you wanted to be an educator? And I'll start with you, Sarah, if that's okay.
Sarah Gibson:Um, I'm not sure of an exact time, but I grew up, my parents were faculty members. So I grew up around students. And so I would say it's not so much that I wanted to be an educator as that I just had a passion for that time of life in higher education, especially the undergrad level and all the changes they're going through. And I grew up around that population, and so it felt very comfortable to then be a part of that journey in a person's life.
Derek Bruff:Wow. I'm actually, I don't know if it's surprising or not, but uh a very high number of my guests mentioned having two parents who are educators as part of their background. So um sometimes my guests have veered away from education because of that reason.
Sarah Gibson:I think I tried to do that, and here I am, I ended up in it anyways. But I really did realize, man, I love guiding people through that moment um as they become adults and start to think about what their life is gonna look like.
Derek Bruff:Yeah, yeah, I love that. How about you, Windy?
Windy Frank:Um, I am from the veered-away-from-education camp. Uh my mom was a teacher. Um, several of my aunts are teachers. Uh my great-grandfather and great-grandmother were the principal and teacher of a school. Um, so for me, I uh I was like, I don't want to do that. And I think there were some gendered things there, like, women can be teachers and nurses, the end. Um, and so I wanted to do something else. And so I ended up in um campus ministry, because I really like college students. They're fun to hang around. Um, 19-year-olds have brains that are broken in interesting ways. Um, and so I enjoyed the teaching aspect in ministry, and then I kind of fell into teaching at the college level, um, because I had a friend who needed a favor, who was a dean at a Christian university, and he's like, Hey, I'm taking on this extra position. I just need somebody to cover this one class for me. Can you do me a favor and do it? And so I covered that one class and I was hooked. Um, I really enjoyed uh the teaching aspect of it, and um especially teaching Bible classes with students who have wildly different levels of biblical literacy, um, and finding a way to teach them where everybody is learning something was really fun and challenging for me. So that's how I kind of fell into it.
Derek Bruff:Yeah, yeah.
Windy Frank:And fulfilling my destiny as an educator with all of the rest of my family.
Derek Bruff:It was unavoidable. Um well, and that eventually brought you to Lipscomb. And so um, could could could you tell us a little bit about the courses that you're teaching now at Lipscomb and and who are the students? Why are they taking these courses?
Windy Frank:I teach primarily the kind of intro Bible classes. Um, we have three classes that every student has to take: uh Story of Israel, Story of Jesus, and Story of the Church. And so I have taught uh Story of Israel and Story of the Church. And then um in the spring I'm teaching uh a class called Faith and Culture, kind of seeing how those two ideas interact. And then the students I have: most of our Bible majors, Bible ministry majors, end up in a cohort. Um, so they're all taking the story classes together. So I have everybody who's not a Bible major. Um and that's really a lot of fun for me. I really enjoy that. Um, and again, I have kids who are coming from backgrounds where they may have been like, you know, Bible Bowl champs in their youth group. Um and I have kids who literally have never opened a Bible before. Um, and so finding ways for it to be interesting and surprising for all of those different populations continues to be a lot of fun for me.
Derek Bruff:Yeah, yeah. That's yeah, the uh it's the non-major challenge, right? Um but but one with a particular uh uh lens to it through through um yeah, Bible studies. Okay. What about what about you, Sarah? What courses are you teaching at Lipscomb and and who are your students?
Sarah Gibson:Yeah, I am working in a little bit more discipline-specific area than Windy is, although I have had a background in working with those um students in the intro-to-comm kind of speech-type class. But my background is documentary filmmaking and new media storytelling. So currently I am teaching things like generative AI storytelling, or um the one that I consistently teach is communication law and ethics, which is a lot of fun. And recently they really love to dive into the ethics of AI. And so that kind of brings my two passions for AI and this idea of law together. Um, and so that has been a very good fit for me currently.
Derek Bruff:Okay. Um, so you you brought up AI, and that's where I want to go with our conversation here today. Um and um, Sarah, maybe I'll start with you. You just identified AI as a passion of yours. Presumably that's not something you would have said five years ago. No. So what what led you uh to kind of focus your career in this area over the last several years?
Sarah Gibson:Yeah, that's a great question. Um, because it is not, it is not the path that I had foreseen for myself. But my parents, as I said, were educators. One was a lawyer, so there's the comm law piece. The other one was a musician, and she, my mother, was really into digital music. And so she was always doing cutting-edge technology stuff with music. She may be one of the first people that took a synthesizer and connected it to a Mac computer. So I grew up with Macintosh in my house, with technology. She taught me to code websites before Yahoo and Google existed. I had my own web page. So this idea of technology being a part of your life, and not being something to be frightened of but really something to embrace, is my upbringing. It's everything about me. And then storytelling is another passion for me. So when I found AI, it really was bringing these two things together. I can now tell stories in interesting ways with AI. And um that kind of spurred the entire passion. And I think I fell down the rabbit hole, the AI rabbit hole, and now here I am, enjoying and loving every minute of it and really enjoying the passion of helping educators understand um where AI is going to exist in the classroom.
Derek Bruff:Uh wow. Okay. Yes. Um, I was also coding websites in HTML. Um, I think there was a Yahoo, but it was it was the old Yahoo that just tried to list all the websites.
Sarah Gibson:Uh yes, yes, back in the day. Yes. I was uh I was a sad middle schooler going, nobody will come to my website. Why is nobody visiting my website? Well, um I mean no one's on the web right now.
Derek Bruff:Right. That's the first problem. There's not an audience yet. Yeah. Okay. But always seeing technology as a tool, but maybe even a tool for creativity, for storytelling. Um yeah. Okay, so I'm gonna come back to that and how AI fits into that. But um Windy, one of the reasons I'm having you both on the podcast is that we know each other in real life. Um, and you were telling me about this AI assignment that you had last fall in one of your courses. And so I'd like to talk a little bit kind of concretely about what AI looks like in your teaching. And so, Windy, I'll start with you. Can you tell us about this um chatbot assignment that you had?
Windy Frank:Yes. Um, so part of my approach to teaching the Bible, especially at intro levels, is to um help students connect to the story and um to the emotion in it. Emotion is just tied into story; I don't know if there's a way to tell a story without emotion. Uh it's interesting what you were saying, Sarah, about how your storytelling was kind of the impetus for this. So I use narrative theology um a lot, and I also try to help them understand that the world of the Bible, especially the Old Testament, um, is not our world; it's very different than where we live. And so helping them understand this distance um is a way of uh maybe getting past kind of that initial um surface reading, which is fine; there's truth to be found with that. But knowing some of the historical background and the um intention of the original authors to the original audiences, I think, is helpful. So all of that sounds boring, but the story of um how this person felt in this moment, um I think is a really helpful lens. Um and also I don't have to get into the weeds of the historicity of um what year did Moses lead the people out of Egypt, and can say, look, these stories have been important enough to write down at great cost and effort and pass down for millennia. So the way they are told is important. Um, so how can we study the way they're told?
Windy Frank:So the assignment um that I did, because uh Lipscomb has given us uh access to uh fantastic tools with AI and fantastic training, like with the Center for Teaching and Learning, where Sarah has done all these great workshops. Um so I was thinking through how I could use, especially, the feature of creating a chatbot um for my Old Testament students. And I uh gave them a group assignment where they had to pick a figure from the Old Testament and um research that figure: what we know about them from the text, as well as some of the uh social and historical background around that time period. And so I kind of scaffolded it. So the first part of it, they had to pick a character. Um, and then we did some exercises on how to research and how to tell if your source is valid. Because especially when we're talking about the Bible, um there are lots of people on the internet with so many opinions uh about the Bible, and not all of them maybe have the training. Like, I'm fine with you telling me how the story of Moses makes you feel on your blog or whatever, um, but don't start making claims about, you know, the year that this happened, um, when no. Uh so the idea of vetting sources was helpful. Um so they could use uh traditional sources from the library, they could use databases, um, but I also let them use uh Perplexity. Um because it would give them the footnotes of where it's getting its information.
Derek Bruff:So quick sidebar, that is an AI tool. Can you say just a word or two about that for listeners who may not know what that is?
Windy Frank:Sure. And Sarah can probably give a lot better details here. Um but you can do a query and then it will tell you the answer, but it will give you basically footnotes of where it's getting this information, versus, you know, the other models that just kind of ate the internet; they're compiling it and giving you a summary. Um, Perplexity will give you where it's coming from. And sometimes you can click on the links that it's saying this is where it's coming from, and meh. Um they may not be the best sources, but that was part of the exercise for the students, is evaluating that. Yeah. Um so they had to do all the research, and then they put that research into the knowledge bank for their bot. Um, and then I told them to make it answer in first person. Um, and then they uh were working with it, and the next step of the assignment was they had to try to break their bot. So um,
Derek Bruff:let me jump in. So the so give me an example or two. Like they would create a bot in the voice of the
Windy Frank:um in the voice of the prophet Jonah.
Derek Bruff:Okay.
Windy Frank:Um so the students that really connected kind of to the emotion of it were the ones who were doing Jonah, because they fully read the book of Jonah, which I don't know if any of them had done before. Um so winning there. Yeah. Uh and then they were like, man, Jonah's really grumpy. And I'm like, yes, that is accurate to the story. Jonah is very grumpy. Um, so they decided to code their bot as grumpy. That was part of it; they wrote like a personality for Jonah. Um, and their Jonah bot is very salty, it's hilarious. Um one of the uh test questions that the character had to answer was, what's your biggest success? What's your biggest failure? And tell me about your relationship with God. Um, just to kind of test it. And then they could come up with their own questions that they wanted to kind of show off what their bot could do. Um, so this Jonah bot, uh, if you asked it what its biggest failure was, about 50% of the time it would say preaching to Nineveh, and they repented, was its biggest failure. Um, and I was like, again, accurate to the character: Jonah was not happy to be preaching to Nineveh. He did not want them to repent. He wanted God to come and smash them all. They were the enemies of Israel. Um, so it was really funny that they had coded it in such a way that it would see the thing that we know about Jonah, other than the big fish, um, Nineveh repenting, as his biggest failure.
Derek Bruff:Wow. Okay. Um, and so the students had to do this background research. They had to um, it sounds like they had some documents they would load into the knowledge base of the chatbot. So typically these AI chatbots have a set of like a system instruction, a set of instructions you give the chatbot that tries to govern how it behaves. And then you can also give it access to some documents as part of its knowledge base. So they had to kind of design all of that, and then they had an interaction with the chat bot with some questions from you, but questions that they also added. So I want to hear more about your assignment and your students' reactions to it, but I want to turn to Sarah here for a minute. Um, you work with a lot of faculty around using AI in thoughtful ways. What do you see in Windy's assignment that you think are is is interesting or novel or even just like uh an effective practice for using AI?
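The setup Derek describes, a system instruction plus a small knowledge base of research documents, can be sketched in a few lines of Python. This is a minimal illustration only; the function name and chat-message format below are assumptions for the sketch, not the API of any particular chatbot platform the students used.

```python
# Minimal sketch of a persona chatbot setup: a system instruction plus a
# small "knowledge base" of research notes that ground the character's
# answers. The names here (build_persona_messages, the message dict shape)
# are illustrative, not a real platform API.

def build_persona_messages(character, personality, knowledge, question):
    """Assemble a chat-style message list for a first-person persona bot."""
    system_instruction = (
        f"You are {character} from the Old Testament. "
        f"Always answer in the first person. Personality: {personality}. "
        "Base your answers only on the research notes provided below."
    )
    # Fold the students' research into the system message as grounding context.
    notes = "\n".join(f"- {fact}" for fact in knowledge)
    return [
        {"role": "system",
         "content": system_instruction + "\nResearch notes:\n" + notes},
        {"role": "user", "content": question},
    ]

# Example: a grumpy Jonah bot, as described in the episode.
messages = build_persona_messages(
    character="Jonah",
    personality="grumpy, reluctant prophet",
    knowledge=[
        "Sent to preach to Nineveh, an enemy city of Israel",
        "Fled by ship and was swallowed by a great fish",
        "Angry when Nineveh repented and was spared",
    ],
    question="What was your biggest failure?",
)
```

The message list would then be handed to whatever chat model the platform wraps; the point of the sketch is just that the students' research, the persona instructions, and the test questions live in separate, inspectable pieces, which is what made "breaking the bot" a meaningful exercise.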
Sarah Gibson:Yeah, I think a lot of times faculty think that um using AI is going to allow the student to shortcut the experience. And so if you look at her assignment and you look at what she did, it's not shortcutting the experience, it's actually allowing them to go deeper into that experience rather than just read about Jonah. Now I have to understand a little bit more about Jonah, and I've got to get this bot to react in that way. And they're starting to critique um what the bot instructions are. Um, and then I bet those students will always remember the story of Jonah. Um, I bet that will be imprinted on them for the rest of their lives because of that experience. Um, and so it's a great example of how AI, when used effectively, will not shortcut the education experience. And I say when used effectively, because if you just use it, write me an essay, all the bad prompting and bad AI use that we are seeing, that will shortcut the experience. Um, and so part of the role of educators in the next several years will be to sell students on why it's important to not shortcut the experience at certain times and then to use AI to deepen the learning at other times.
Sarah Gibson:Um that's something we're thinking about at Lipscomb already. We don't talk about AI literacy. Um, a lot of institutions are talking about AI literacy. We talk about, when AI is a part of everything that we do, like the cell phone that you have in your hand, what do we want to protect? Um, and what are the skills that we think are essential that our students need to have? And I was just in a meeting yesterday where we were making a list of all those skills. What we want to do instead of focusing on AI literacy is we want to focus on protecting those skills while also embracing AI. Um, and so it's a totally different approach than you're seeing at other institutions. But I think it's gonna set us up in the long run to really um provide an ecosystem for the students to learn in, where they can start to pick and choose almost their own adventure in certain ways. You see that in Windy's assignment as well, where they get to dive deep into a particular character, and they get to have some, I'm assuming they may have had some agency over that choice. Yeah.
Derek Bruff:Okay. Windy, did you did you say you had a kind of class presentation piece to your assignment?
Windy Frank:We did. Um, so they had a presentation at the end of the semester, and they um kind of outlined what their research process was. How did they find their information? How did they work together as a group? Did they assign pieces or did they all meet together? Um, and then their bots had to answer the three questions that I had given them live, projected on the screen. Um and then they could come up with a couple of questions that they thought were showing off the work that they did, and they would say, We're gonna ask it this question for these reasons, because we want to show its strength in emotional reasoning or something like that. Wow. Um and then their classmates watching them could ask their bot questions, and those were real funny. Um, so we had a lot of fun with that part of it. Yeah.
Derek Bruff:Yeah.
Windy Frank:And I think somebody asked Daniel, a Daniel bot to name the lions in the den with him. Um and it came back with like Hebrew names that meant like God is with me and things like that. It was so funny.
Derek Bruff:Wow. Okay. So like not a bad answer, actually.
Windy Frank:Yeah, actually. We're like, oh, that's that's a decent. I thought it was gonna go with like fluffy or something, you know?
Derek Bruff:Right. Okay. Oh, so fascinating. So it's and I mean, part of what's happening, I think, is that the the AI, um, this is not my language, but it's it's a kind of provocation for the students, right? The work is what the students are doing, analyzing the historical sources and making sense of what's there. The AI becomes a kind of uh a tool or uh almost a game to play to give them an excuse to do that kind of thinking. So you were, you know, you're teaching Bible courses, you're not teaching AI coding.
Windy Frank:Yes.
Derek Bruff:Like, like, how do you feel about navigating that line? Like they need to know enough to kind of play with the tool, but you're not trying to turn them into master prompt engineers.
Windy Frank:Exactly. Yeah, um, it was interesting, because I was surprised that I had a couple of students that kind of pushed back uh against um using AI for various reasons. Um, some of them uh had an environmental objection: is this worth the electricity to do this? Um some of them had a moral objection to AI in general, uh you know, it's stealing information from all of these original creators, and so is it ethical for us to use it even for a project like this? Um and I had some that objected to making it speak in the first person from a biblical character's perspective; uh that felt icky to them, um, like borderline blasphemous, to have it answer as Moses or answer as Abraham or something like that. Um I probably would not do the same project in like a Story of Jesus class. I would not have them build a Jesus bot. Okay. Um, because that would feel like it was crossing a line to me. Uh, I don't know if you could possibly program something in such a way that it would answer as the son of God. That feels like crossing a line.
Derek Bruff:Yeah.
Windy Frank:Um, so yeah.
Derek Bruff:Wow. Okay. Yeah, that is not an objection I had heard from students before your situation.
Windy Frank:I would not have imagined it. So it was it was interesting to to navigate.
Derek Bruff:Yeah, I I do wonder how those students would feel about the Veggie Tales cartoons where various Old Testament characters are depicted as talking vegetables.
Windy Frank:Yes. And I I asked them, like, okay, would it be okay if you were writing a skit and you were portraying this person in the first person? They were like, of course that would be okay. And I'm like, Oh, okay.
Derek Bruff:All right, all right. Yeah, well, I mean, and there is something about AI that kind of creates these um emotional reactions in people. I find this quite often, sometimes very positive, but often very negative. Like, I don't know if it's the uncanny valley thing, that it's almost human, but not quite. Um, but um, that's fascinating. So I did want to ask, and I think this is where Sarah might come back into the story a little bit. Um, Windy, it's my understanding that because of some of these objections, you had to do some kind of policy work at the university. Yeah. Yeah. How did that play out?
Windy Frank:Because yeah, this is a gen ed, um, and this was a required assignment. And so we had to think through how do you deal with the students' objections to using AI in a required assignment, because a lot of our courses that aren't programming courses are using these um sorts of assignments. So I uh, you know, had a conversation with Sarah, and I think um the provost uh got in on this email chain. And so I did create an alternate assignment for the students that objected, and they could um just do like traditional research uh at the library with paper books if they wanted to. Um and then, all of the students who had objections were on a team or in a group with students who did not have the same objections. So they could work together as a team and say, okay, I understand that you have an objection to AI in these ways, so can you do the majority of the research for us in the library? And that way you don't have to be part of writing the bot or doing the presentation in class or interacting with the bot. Um so they were able to uh navigate that on their own as well. So I gave them some options, um, and then I was working with them individually to say, is this a good solution? Do you feel like this is, you know, respecting your objections um while still meeting the objectives for the assignment and objectives for the class?
Derek Bruff:Right. Because I mean, again, working with AI was not on your list of learning objectives, right? Yeah. So it was just a tool to get them to these other things. And so there's other ways to get them to those same learning objectives, um, perhaps that don't involve AI. Sarah, can you speak to this challenge a little more broadly at Lipscomb, given kind of Lipscomb's movement into AI in these ways?
Sarah Gibson:Yeah, I'll set up a little bit of context first: um we became the first independent university to universally adopt an AI tool, which means that we gave an AI tool to all faculty, staff, and students. And that happened July 10th. We did not know if that was going to happen. Uh, it's very expensive, it's a huge investment. Um, and so all the work that we did with our faculty last year around being fearful about AI, we intentionally did not do with students, because we didn't want to offer something that we didn't have. Um, and I think we all were a little surprised. Um, but it makes sense looking back that when they left in May, we were Lipscomb. And when they came back in August, we were an AI-forward university, which was different than what they had expected. Um, and then all of a sudden, all these gen ed courses have AI assignments in them.
Sarah Gibson:Um, and what we had not anticipated was that the work that we did with the faculty, of grieving the world that is leaving us and living in this liminal space to the world that will come after AI, we did not do with students. Um and the students now needed to do that work, that grieving. Um, Windy was kind of the canary in the coal mine for us and brought us some of the most interesting um uh objections. What we have learned since then is that um those students will actually move from fearful to um embracing faster than faculty will, but it takes very intentional and very thoughtful conversations. And those conversations have to come from faculty. So it has to be on the ground with them. We can't just have a big meeting with all students and bring them alongside. The faculty are the ones that are gonna have to do the work.
Sarah Gibson:So we are in the process now of um giving better resources to our faculty to think about having those conversations with their students, and then also sharing that story with other universities and telling them that if you are looking towards universal adoption, start to engage your student government, start to engage some student organizations of your high-flying honor students to be a part of that process, so they have buy-in as well, because our students didn't have a choice and they didn't have buy-in. And our conversation now is, at what point will we say we are an AI-forward university and you are going to participate with AI regardless of objections? Um, is it one year or two years down the road? At some point that will happen. Um and the question amongst us right now, thanks to Windy, is um, when does that happen?
Derek Bruff:Wow. Okay. That that's gonna strike some of my listeners as a little radical.
Sarah Gibson:I I I think so, yes.
Derek Bruff:Yeah. Because, you know, on the one hand, we don't say to students, I'm sorry about your moral objection to email. Right. But you're gonna be provisioned with an email account and you need to check it regularly as part of your work here. As a like, like like we don't we don't give them an opt-out for email. Um, but again, people don't have this visceral reaction to email that we've been talking about.
Sarah Gibson:Um it's it's gonna be a big question. And the reason that we think that at some point we're gonna have to draw that line, and I'm not gonna say Lipscomb's drawing that line tomorrow, that is not what's happening. The question is when does it? Um, is because of employers. Uh, employers, um, more and more we're hearing from them that they're saying if your students are AI resistant, they won't even touch AI when they come out. It's a non-starter. I can't work with them. Um, and so as the the industry and as employers start to push forward with AI, what we call, you know, misuse of AI, the business world calls um progress. That's a Jose Bowen quote, I think, from Teaching with AI. Um, as that happens, we're at some point going to have to make a decision, and every institution is gonna have to make a decision. Do you continue to let them opt out? Or do you say, no, this is what we do? Um, and um you're very clear, I think you have to be very clear with them up front that that's what you're gonna do.
Derek Bruff:So you mentioned, you know, Lipscomb has adopted an AI tool. And I don't think we've named it, but I know one of the tools you've adopted is BoodleBox.
Sarah Gibson:That is the one tool, that is BoodleBox, yes.
Derek Bruff: So all students, all faculty, all staff have access to it?
Sarah Gibson: Yes, and BoodleBox is basically an aggregator. It brings all the AI systems together. So within that BoodleBox platform, they're getting ChatGPT, Gemini, Claude, Perplexity, tons of image generators, Nano Banana Pro—you name it, it's probably in BoodleBox. It's like a walled garden, and it's very safe and very secure. Also, going back to Windy and one of the objections the students had, it is surprisingly energy efficient. It uses up to 90 to 95 percent fewer tokens than the models themselves—they have a proprietary way in which they do that—so it doesn't use as much energy. When you go to BoodleBox and use ChatGPT, you're using less energy than when you go to ChatGPT itself.
Derek Bruff: Wow. Okay. So a lot of the concerns I hear about AI are coming from faculty. Have your faculty colleagues expressed concerns about this university-wide adoption? And how do you respond to some of those concerns?
Sarah Gibson: I would be interested to see if Windy's heard things, because sometimes people don't want to come talk to me—they're like, Sarah's the AI person. I'll say what I know, and then I'm really curious to see what Windy thinks. A year ago—so not this past August, but the August before, August 2024—there was a ton of resistance. What you're seeing at other institutions, Lipscomb was there. And through a series of very strategic moves, reaching in the dark, doing some things wrong and some things right, by February—so that's just a few months—the faculty said, this is what we have to do. And that movement became grassroots. That's another thing we love to say: we are the first grassroots movement, where the faculty went to the administration, put pressure on them, and said, we need a tool, because we need something where we have equity and everyone has access, rather than just one population.
Sarah Gibson: And we have been surprised at what a difference that made. We were reflecting on it just yesterday: it made a bigger difference than we thought to be on the same platform, where we can have the same conversation about what we're doing. That objection turned quickly toward acceptance—not because faculty were saying, oh, I love AI, and not because they wanted more work learning how to do AI, but out of care for the students. That's what I think is so special about what's happening at Lipscomb: they realized, I need to do this work and I need to allow this to happen because I care for my students so deeply, and they're going into a world where those skills are necessary. I get chills every time I talk about it, just hearing the faculty talk about that care. You're seeing people at the beginning of their careers, mid-career, and end of their careers, all across disciplines, who are saying, I have to figure out how to embrace this appropriately so that I can give my students what they need in this world. And they also realize that we have a chance to shape a generation that's going to reshape society. As AI becomes a part of everything we do, society will be reshaped. And we hope that society is reshaped differently than it was after social media.
Derek Bruff:Right.
Sarah Gibson: With social media, we thought we would be more connected, and we're less connected. We hope that AI will make us more connected, but it will only do that if society stands up and says, people first, human-centered AI only. And what an opportunity we have, Windy, right, to work with these students, who will be the ones who can say the human comes first, not the AI. So faculty really realize, I've got to adopt it now so that we can start to set that tone.
Windy Frank: From our position as a Christian institution, I know that my students are going to go into the world and be teachers and nurses and animators and do all of these different things. We place a lot of emphasis on thinking about your job as vocation: how can you connect the good world that God has made, and the story of God's redemption of this good world, to whatever job you're doing? And so I want these students who are Christians to be innovative and conversant with AI and with these different conversations. Sheltering them away from it, I don't think, is the answer. I don't want it to be only the students at secular universities who are leading the charge and leading the field in how to think through and use AI. I want Christians to be part of that conversation too. And I want my students who are not Bible majors—the ones who aren't going to be the pastor or the youth minister, but who are going to be the Sunday school teacher and the VBS leader—to be able to think through the ethics and the implications, how we can use this well, and where the line is, and to actually know what it is and how it works, so they can push back against the boogeyman of AI that I think is prevalent in the zeitgeist right now.
Derek Bruff: Yeah. Well, we're short on time, but I want to hear very briefly, Windy, what's next for you? Are you gonna run this assignment again?
Windy Frank: I think I am gonna run the assignment again because of, paradoxically, the way the robot helped them connect emotionally to these characters. I had a student who said, "Man, this robot's making me feel stuff." That might be the title of a paper in the future. When we read the book of Ruth, we know the ending—we know it has a happy ending and everything gets better. But to ask the Ruth character, what was it like when your husband died in Moab, and hear her say it was scary, it was risky, I didn't know what I was gonna do next—they hadn't thought of that before, because they had only seen the happy ending. So for them to be able to connect with the emotion of it, I think, is worth doing again.
Windy Frank: And then I'm thinking through how to use it in my faith and culture class this semester. Some of the feedback I got the last time I taught this course was that they wanted more examples of contemporary issues and how to interact with them from a faith perspective. I'm still workshopping this, so we'll see how it works out, but right now I'm considering giving them a choice of different issues, like social media's effect on mental health. Then they'll choose four perspectives from different sorts of people who could have different opinions on the issue, and do some actual research on, say, how a boomer pastor who is a tech skeptic would feel about it. They'll write the questions so the AI is answering from that perspective. I'm hoping it helps them grow in the ability to see something through someone else's eyes. They can have a strong opinion supported by all sorts of deep convictions, but I want them to be able to step back just a little from that and ask, can I understand how someone who came to the opposite conclusion on this issue got to that place, and how they're seeing it? So I'm hoping it will be a way to model having conversations and seeing someone who disagrees with you not as the enemy or as evil, but as someone who simply has different life experiences than you do and has come to different conclusions. We'll see how that goes.
Derek Bruff: Yeah, that's a great goal. And I'm sure I'll get to check in with you and see how that goes. Sarah, anything from you about what's next in your AI journey?
Sarah Gibson: Just as I used to have a passion for helping students transition in that time of their lives, I am developing a passion for helping faculty do the same: helping faculty think more about AI, what it means now, and what it means in the future. So I'm really gonna start leaning in more toward helping people like Windy do their jobs and hold on to the things they care about while also embracing AI, and really wrestling with what that looks like. I love the liminal space—I live in that space. So I'm looking forward to helping more educators at Lipscomb and other places navigate that space over the next five to ten years.
Derek Bruff: That's great. Well, as Windy noted from her colleagues, it is a lot of work, right? We have to change a lot of things about our teaching, and so I appreciate having folks like you out there to help people do that work. Thank you both for being here today and for sharing your experiences and perspectives. This has been really delightful. Thanks for being on the show.
Sarah Gibson:Thank you. It's been fun. Thanks for having us.
Derek Bruff: That was Windy Frank, adjunct faculty in the College of Bible at Lipscomb University, and Sarah Gibson, faculty fellow for AI and professor of communication at Lipscomb. Thanks to Windy and Sarah for coming on and sharing their experiences with AI-aware teaching. I'm excited to hear how Windy's continuing experiments with AI chatbots this spring turn out. Maybe I'll ask her back on the show after the semester ends to find out. And as Sarah said near the end of our conversation, she's doing a lot of work helping other faculty navigate the impact of generative AI on their teaching. If you'd like to learn more about her work in this area, see the show notes for some links.
Derek Bruff:Intentional Teaching is sponsored by UPCEA, the Online and Professional Education Association. In the show notes, you'll find a link to the UPCEA website where you can find out about their research, networking opportunities, and professional development offerings. This episode of Intentional Teaching was produced and edited by me, Derek Bruff. See the show notes for links to my website and socials, and to the Intentional Teaching newsletter, which goes out most weeks on Thursday or Friday. If you found this or any episode of Intentional Teaching useful, would you consider sharing it with a colleague? That would mean a lot. As always, thanks for listening.