Intentional Teaching
Intentional Teaching is a podcast aimed at educators to help them develop foundational teaching skills and explore new ideas in teaching. Hosted by educator and author Derek Bruff, the podcast features interviews with educators throughout higher ed.
Intentional Teaching is sponsored by UPCEA, the online and professional education association.
AI Across the Curriculum with Jane Southworth
Questions or comments about this episode? Send us a text message.
Today on the podcast, I’m excited to share an interview with Jane Southworth, professor and chair of geography at the University of Florida and co-chair of the committee that designed UF's "AI Across the Curriculum" program. That program was designed in 2021, two full years before the launch of ChatGPT!
Jane shares about the role of artificial intelligence, particularly machine learning, in her landscape change research, and how that work got her involved in AI curriculum initiatives at UF. Jane also provides a lot of details on the new UF program, including the university-wide undergraduate AI certificate, AI-focused undergraduate research opportunities, and what turned into a herculean effort to get AI literacy embedded across the UF curriculum. I also asked Jane how the launch of ChatGPT affected this big project as it was rolling out.
Episode Resources
· Jane Southworth’s faculty page, https://geog.ufl.edu/faculty/southworth/
· AI at the University of Florida, https://ai.ufl.edu/
· “Developing a model for AI Across the Curriculum: Transforming the higher education landscape via innovation in AI literacy,” Southworth et al., https://www.sciencedirect.com/science/article/pii/S2666920X23000061?via%3Dihub
· “Building an AI University: An Administrator’s Guide,” Joe Glover, https://www.ufl.edu/wp-content/uploads/sites/5/2023/11/Building-an-AI-university-An-administrators-guide.pdf
Podcast Links:
Subscribe to the Intentional Teaching newsletter: https://derekbruff.ck.page/subscribe
Support Intentional Teaching on Patreon: https://www.patreon.com/intentionalteaching
Find me on LinkedIn and Bluesky.
See my website for my "Agile Learning" blog and information about having me speak at your campus or conference.
Derek Bruff (00:05):
Welcome to Intentional Teaching, a podcast aimed at educators to help them develop foundational teaching skills and explore new ideas in teaching. I'm your host, Derek Bruff. I hope this podcast helps you be more intentional in how you teach and in how you develop as a teacher over time. This past summer, I was at a conference and ran into Flower Darby, whom listeners might know from her co-authored books, Small Teaching Online and The Norton Guide to Equity-Minded Teaching. Flower has been doing a lot of work over the last two years supporting faculty explorations of generative AI in their teaching, and we spent a few minutes swapping resources and citations since I've been doing that work too. Flower pointed me to a paper from a team of faculty at the University of Florida about an AI across the curriculum program there. I thought, sure, that makes sense. Lots of universities are trying to grapple with ChatGPT and the like.
(00:56):
It sounds like UF is a little ahead of the game. Then Flower pointed out to me that they started planning this program in 2021, a full two years before ChatGPT was released to the world. Today on the podcast, I'm excited to share an interview with Jane Southworth, professor and chair of geography at the University of Florida and co-chair of the AI across the curriculum committee that designed the new University of Florida program. Jane shares about her landscape change research (she's a geographer), which has been using machine learning since long before ChatGPT was launched, and how that work got her involved in AI curriculum initiatives at UF. Jane also provides a lot of details on the new UF program, including the university-wide undergraduate AI certificate, AI-focused undergraduate research opportunities, and what turned into a Herculean effort to get AI literacy embedded across the UF curriculum. I also asked Jane how the advent of ChatGPT affected this big project as it was being launched. Hi Jane, welcome to the podcast. I'm very glad to have you here and glad to hear about your experiences in the world of AI in the last few years. Thanks for being here.
Jane Southworth (02:09):
You're welcome. Thank you for the invitation.
Derek Bruff (02:11):
So I'll start with my usual opening question, which is not necessarily about ai. Can you tell us about a time when you realized you wanted to be an educator?
Jane Southworth (02:23):
It is an interesting question because I don't think I ever actually considered becoming an educator. I went to graduate school in America. I'm British, and I went to graduate school in America for practical reasons, but I don't speak a foreign language and they funded me if I came over. And so I joined graduate school in the US, and as part of your graduate program, I was funded by teaching, and I had no prior teaching experience, which I think is very true for every graduate student coming into a graduate program. And I learned I really enjoyed it. I enjoyed teaching. I enjoyed linking in with students who often were completely unfamiliar with the field of geography and teaching them about what it was and making them realize it's actually a really fun and highly relevant discipline. So America is very different than England in that. In England, geography is a very, very common subject. We do it from age four. It's the most popular undergraduate degree, and in terms of jobs at the end, it's one of the highest paid undergraduate degrees when people go get jobs. It's considered a really well-rounded education. You do math, you do English, you have to write, you have to speak, you have to design maps and other products. So I came to America and everyone thought I did rocks. Do you study rocks? Do you do names? And I'm like, I have no idea what you're talking about. So it kind of became fun during graduate school to educate and teach people. And then I decided to go on to do a PhD, and then again, same thing, I was funded partly by research and partly by teaching. And so when I graduated, I knew that I wanted to be in higher education and I wanted a position where I could still do both. I could do the research, so I kind of targeted R1 schools, but I also wanted a significant component to be on that teaching, education, program-building side. So a very long answer, sorry.
Derek Bruff (04:29):
Yeah, yeah. Finding a job where you get to do both of those is a little bit tricky sometimes in higher ed.
Jane Southworth (04:37):
It is.
Derek Bruff (04:37):
So that's great. So given that background, what is your connection to artificial intelligence?
Jane Southworth (04:46):
Research-wise, I have always used AI. So while AI has kind of blown up in, well, everywhere in society, every level of education is kind of talking about it, predominantly due to ChatGPT coming on the scene, I've always used machine learning as a tool in what I do. So my research is predominantly using satellite technology to look at land cover change occurring on the Earth's surface. So I look at changes in vegetation cover, often in and around national parks around the world, especially focused in Africa. I do a lot of work in savanna systems and then kind of try and work out what's going on. So if there are changes, be it vegetation greening or loss of vegetation cover, I then try and link that with what's going on on the ground. So to do that, we use machine learning. We have images that contain millions and millions of data points, and there's no way to really run it by hand and basically take every pixel and assign it a value or a code if we want to say what kind of vegetation it is.
(05:52):
So we train models to do it for us, and that's machine learning. You take information about what's on the ground, you feed it into the computer, and you train it with known locations to go and find other areas on an image that look just like that one does and then label them the same way. So I've always done that kind of approach. I have never in the world called it AI, right? I'm a satellite remote sensor, I'm a land change scientist; these are just the tools that I've used forever. And suddenly I got an email from someone at the University of Florida over in the provost's office that said, we would like you to be in charge of a task force. We've noticed you do a lot of AI research yourself, but also you're on every AI committee at UF. I was on college curriculum committees, college research committees, university-wide committees, and it was because whenever anyone did a search on machine learning, my name came up. And they said, no, seriously, we did a social network analysis.
(06:49):
You are kind of this huge node in the college. And I said, well, what do you mean? And they said, well, we searched on machine learning, and I said, oh yeah, yeah, yeah, no, I do that. Yeah, that's what I do. So it was kind of funny. But then they approached me, and then Dr. Kati Migliaccio, who is another department chair over in agricultural and biological engineering, and the two of us did not know each other. So the first thing we did was we talked with each other before we decided whether or not to accept it. And Kati is one of the most amazing individuals I've ever had the pleasure of working with. And so very quickly we basically said, well, I'll do it if you do it. And so we became the AI across the curriculum co-chairs.
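The supervised classification workflow Jane describes, training a model on labeled ground-truth pixels so it can label every other pixel in an image, can be sketched in a few lines of Python. This is a minimal illustration using synthetic data and scikit-learn; the band names and land cover classes are hypothetical, not UF's actual remote sensing pipeline.

```python
# Sketch of supervised land cover classification: label a handful of
# known pixels ("training sites"), train a model, classify the rest.
# Synthetic data; class names are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Pretend each pixel has 4 spectral bands (e.g., red, green, blue, near-infrared).
# Class 0 = water, 1 = grassland, 2 = woody vegetation (hypothetical labels).
n_train = 300
bands = rng.random((n_train, 4))
labels = (bands[:, 3] * 3).astype(int).clip(0, 2)  # fake "ground truth"

# Train on the labeled pixels (the "known locations")...
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(bands, labels)

# ...then label every pixel in a new scene the same way.
image = rng.random((100, 100, 4))  # a 100x100-pixel, 4-band scene
predicted = model.predict(image.reshape(-1, 4)).reshape(100, 100)
print(predicted.shape)  # one class label per pixel
```

In a real land change study the training labels would come from field observations or reference maps rather than a formula, and the classified maps from different years would then be compared to detect vegetation change.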
Derek Bruff (07:30):
Alright. Okay. So let's talk about the kind of timing and motivation for this, right? So the decision to have an AI across the curriculum initiative at the University of Florida came before the launch of ChatGPT, is that correct?
Jane Southworth (07:48):
It came well before the launch of ChatGPT.
Derek Bruff (07:50):
Well before the launch. So what led to that decision? Because I can imagine a university in the year 2024 deciding, okay, we're going to do a lot with AI; plenty of universities have made that decision in the last 23 months or so. But you guys were well ahead of the curve, and as you indicate, AI can mean a lot of different things. So I suspect you weren't imagining a ChatGPT at the time.
Jane Southworth (08:15):
No.
Derek Bruff (08:15):
But what was the sequence of events that led to that email you got inviting you to be on that task force?
Jane Southworth (08:22):
So that sequence of events didn't involve Kati and me. Obviously we got invited in, I think it was September 2021 when we got our email. But prior to that, the University of Florida does a report every year, and in their 2020 and 2021 accountability plans, which are published and available publicly online, they actually mentioned that they wanted to engage more within the field of AI. So they put in their university-wide plan to build programs in artificial intelligence. And a lot of the background of that honestly was from our then provost, who is Dr. Joe Glover. And the way Joe tells it is he was walking down a corridor one day, he's a math professor, and Chris Malachowsky, who is one of the co-founders of Nvidia, was visiting UF, and Chris Malachowsky said to Joe, what would you do with a supercomputer at the University of Florida? And Joe said, we would build an AI university and AI across the curriculum. And so that I think is pretty much where it came from.
Derek Bruff (09:36):
Alright.
Jane Southworth (09:37):
Then there's a little bit, I think, of back and forth there, and obviously Covid occurred in there because this was pre-Covid, and then during Covid was when actually they were building the supercomputer HiPerGator. But basically what came about is Chris Malachowsky, a UF alumnus, gave $25 million to the University of Florida. Nvidia gave $25 million, and for them it was in the form of computers and people. And then the University of Florida agreed to a commitment of $20 million to hire AI faculty in addition to the 400 plus of us that were already here working in those fields. And we hired, I think, 106 faculty in
(10:17):
colleges across the university in a two-year AI hiring initiative. So that all set the stage then for AI across the curriculum as a program. So there was a lot of investment and a lot of infrastructure. They were building the supercomputer housing during Covid, and then we brought HiPerGator in, which I think at the time was in the top five of supercomputers on a university campus. Obviously, it's like a new car: the minute you drive it off the lot, it loses value. I feel like supercomputers are the same, in that you're at the top for a very short period of time before you've been bettered. And Nvidia's made just a massive investment and basically changed the landscape here at the University of Florida for us. So we're very fortunate that both Chris and Nvidia have supported that. And there are people here at UF on campus, a lot of staff on the computing side as well, that have been hired and trained to help faculty use the infrastructure, both on the research side but also, importantly, on the instruction and teaching side as well.
Derek Bruff (11:24):
Yeah. So what does that look like? What does an AI across the curriculum initiative, what are some of the moving pieces for that?
Jane Southworth (11:33):
So the AI across the curriculum program that Kati and I were asked to develop was part of the Quality Enhancement Plan, which is part of our SACS accreditation process. For SACS accreditation, you're required to design basically a new and innovative program that you'll run for at least five years, and it's part of your accreditation process. So along with those people coming to campus and looking at all the different things the university is doing, we also had to present this plan for this new initiative. So for us, that new initiative that we were told was AI across the curriculum: here's a task force of 40 plus people from all over campus, off you go. To which Kati and I said, what's a QEP? Because we'd never heard of it, right? On a campus, as a faculty member, you might see the result of it. So the one before this was internationalizing the curriculum, and I remember seeing all the advertising for it.
(12:29):
Obviously we're a geography department; a lot of our faculty are involved in international programs. So I knew a lot of the programs that had come about, but I had no idea that it was part of this larger QEP program. And I think that's probably pretty indicative of most faculty on a campus. So for the QEP, we pulled together people from all those colleges, a lot of them associate deans or higher-ups, as well as all the way down to undergrad students. So a very diverse task force. And Kati and I were given free rein to basically build this program, and we looked at the internationalizing the curriculum program in terms of the QEP. It's like a hundred-page document that lays out the plan for the university to run for five years, and importantly, how they're going to assess that program.
(13:15):
Our program officially started in fall 2024. We presented it back in February to SACS when they came on campus for our accreditation process, it got approved, and it started this fall, and now it'll run for five years here at the university. So we kind of had five pieces that we came up with. And the first one was that AI needed a home if we wanted AI across the curriculum. And immediately, what that means is most people thinking of AI think of a computer robot in engineering or computer science. And we did not want that, right? We said, if we're truly going to have AI across the curriculum, no one can own it other than the university. So we designed, I mean, I laugh now thinking of it because we met with the provost, whom we didn't really know, and said, hey Joe, this is the structure we think we should have.
(14:08):
And by the way, you need a center, and we need it now. We can't wait a year or two for this, and it needs to be done. And so we had one of our first meetings with Joe Glover, and we said, we've had our first task force meeting, and everyone agreed there needs to be a home for all of these programs, these academic programs we're building, the assessment pieces. Because otherwise, and especially on a university campus the size of the University of Florida, with 55,000 people, you need to know who is reporting and recording, who is collating all the information that is going to be reported back to SACS for our accreditation. So we wanted to design that home upfront, and we wanted an AI center. And we suggested it have a research arm and an academic arm, but obviously at this point, we were just focused on the academic initiative.
(14:57):
And Joe laughed at us literally on the phone. Hopefully Joe will never hear this. No, it's fine. And he said, yeah, no, I don't think we need another center. And again, you're associated with many different universities; we love our centers and our institutes and everything else. Is there really a need for another one? And we said, there really is, because this is reporting and accreditation. So we signed off from that meeting, I think it was an hour meeting, and we were told pretty much, it looks like you're going on the right path, but no, you're not getting the center. And it was in February, and the next day, Joe said, hey, can we jump on a call? He said, so about that center, I've been thinking, yes, we need a center. And we had a center at the University of Florida with a million-dollar budget within three weeks.
Derek Bruff (15:43):
Wow.
Jane Southworth (15:47):
So it was taken very, I mean, it was really nice being part of this task force where the provost was fully invested in the entire process and would listen to the committee.
(16:01):
So I mentioned there were five ideas. The first one was the center. And so we had a whole subgroup for the center, and they were like, we're done, our bit's done. And we're like, yeah, good try. Get back in with everyone else. So that was the first one. Our other initiatives were all towards classroom and assessment. So the University of Florida designed a university-wide certificate program called Fundamentals of AI. That program had an intro course called Fundamentals of AI that's taught by engineering. It had a second required course, which is philosophy and ethics of AI, also required, taught in the College of Liberal Arts and Sciences in the philosophy program. And then the third piece of the certificate was basically every college put forward the courses that they wanted to fill the gap. So no matter what college you were in, if you wanted to do this university-wide program, you could take any course as your class three, right?
(16:58):
One and two are set; this third is more specialized, so you can make it match your major, or you could do something in another college. It didn't matter. It was completely up to the student, but it was a nine-credit-hour certificate that you could then point to and say, okay, I have done some fundamentals of AI programming here at UF. And there was no assumption of knowledge going in. So the baseline for that intro course was a lot of GUI interface as opposed to you going in and coding. So it was for English majors, it was for engineering students, for anyone who wanted it. But in addition, a lot of programs also designed their own AI certificate. So for example, in geography, we have a geo AI certificate program because we are geospatial scientists, and geo AI is a huge field. So we designed our own certificate program that our students could take, or anyone else who wanted to take it, in house. So that first piece, certificate programs, that's collating what's already there, with everything being run and managed, in terms of recording for numbers, through this new center we'd created, which became the AI squared center. It's the academic initiative for artificial intelligence. So AI, AI.
Derek Bruff (18:10):
AI squared. Yeah. Quick question about the certificates. Did the university already have a kind of history of doing undergraduate certificates, particularly ones that cut across colleges?
Jane Southworth (18:25):
No on the latter, yes on the former. So our students love certificates, and I think in part it's because they can do something outside of their major and show an area of expertise for when they're going on the job market. So they don't have to be in a major for most of the certificates; they can take them in any program. And it's a great way for you to stream courses. So for our undergrad students, for example, in the Department of Geography, we have four undergrad certificates. We have the new geo AI one, we have a GIScience (geographic information systems) certificate, we have medical geography and global health as a certificate, and we have meteorology and climatology. So you can kind of see they're focused within areas that show an area of expertise and focus. The cross-cutting university-wide certificate is a lot clunkier to manage, honestly.
(19:21):
So the way we run certificate programs at UF, there's a certificate coordinator. That's easy when all your courses are in the department; you can go and look at transcripts. It's a lot clunkier when you've got to manage it across different, not only departments but colleges. But they did put that in, and it is run, and I think it's run out of engineering, where there's a lot of amazing people involved in the AI program. And so the purpose of it, though, was for it to be campus-wide and anyone could take it; there's kind of no expectation coming in that you had to have expertise. And the other thing that I think is really cool is that first course is fundamentals of AI; it's all about engineering and programming. And the second required course is philosophy and ethics. And so they're given equal weight.
(20:08):
And I think that was critical, and that made a lot of people more willing to engage in AI, because they were suddenly like, oh yeah, no, I could see I'd have to know something about the background of AI and how it is created and applied, but I'm really interested in the implications of AI for society, the implications of the ethics and the bias and all of the problems potentially with AI in society. So I think that brought a lot more people on board. So our certificate program campus-wide, I think, is really, really cool. And it's already had hundreds of students go through it. It's proven popular already. So we did the center, then we did certificates and programs. The next piece we did is something called AI Scholars. And to be completely honest, these next two, AI Scholars and the AI medallion, are both stolen completely from the internationalizing the curriculum QEP.
(21:00):
They had the exact same thing. So there was a structure in place. It made sense, right? Build on something we already have that works. They have, I think, a series of nine different options, and they do, I think, four or five of them to become an AI scholar. Oh, sorry, that's the medallion. That's the medallion program. So for the medallion program, you basically hit on all these different areas. So I mean, I can send you the document; anyone can go online and get the QEP document, but there's a whole series of options. You can do nine credit hours of coursework tagged as AI, which I'm leaving till last because it's the most complicated one. You can be an AI scholar. You can do a CURE class, which is the course-based undergrad research experience, and we have ones with a focus on AI. We also have ICUREs, where industry is a partner, and you're addressing a problem using AI tools as an interdisciplinary team in a class to address something that is of interest to industry.
(21:58):
So those students are also getting some professional development skills. And then AI scholars, that's one of the things you can be if you want to be a medallion, which is kind of the pinnacle of the AI program. So as you're going through all these different options, coursework, certificates, the scholars, all these different things, if you do four or five of those, all managed and basically counted through the AI squared center, then you are an AI medallion student, which means you physically get an AI medallion to wear at graduation. So we have bling that goes with the AI program students.
Derek Bruff (22:39):
So that sounds like it's another form of certification, but it's kind of a curricular, perhaps co-curricular piece.
Jane Southworth (22:48):
Yeah,
Derek Bruff (22:48):
it's not Just straight courses. Yeah.
Jane Southworth (22:51):
Yeah. You've put a bunch of different pieces together. You've shown that maybe you got involved through the certificate program, but then you're like, hey, if I do three, four, five more of these, including events that the AI squared center hosts. We're about to have AI Days here for two days on campus. We do a lot of panels and discussion, they do presentations, all these different things, and students can just check off this list. Then when they get to five of them, they're a medallion.
(23:16):
So one of them I mentioned is AI Scholars. We have a scholars program at UF already; again, wherever possible, we just jumped on the bandwagon of something we already had. So a scholars program here is you apply to do research with a faculty member, and we have college-wide scholars, we have university-wide scholars. You work in a faculty member's lab or research lab or computer lab, and you work with them. You get paid, I think, a couple thousand dollars, I can't remember exactly, because the college versus the university versus the AI programs all pay slightly differently. And the faculty member also gets $500, and they can use that for anything. Most of us use it back on the student, to send them to a conference or so they can present their work. So that program was already here at UF, so we just made an AI version.
(24:05):
Again, it's run through the AI squared center. You apply once a year, and it's a very popular program. You say what you want to do and who you want to work with, you get a letter from the faculty member, and then obviously, in the case of AI scholars, you are linked in with the field of AI. So I had one, Patrick, who just worked with me in Namibia, looking at land change and using machine learning models as a way to incorporate that. But he also spent the summer in Namibia doing the field work and actually saying, okay, I know what it looks like on a computer, but what does that mean when I get to the ground? And actually interviewing community members about how they're using the land that he's looking at from satellite imagery. So an absolutely amazing program. And he went there with a collaborator of mine, Dr.
(24:48):
Brian Child, who works in conservation in Africa. There's a lot of that kind of interdisciplinary linkage. So we have AI scholars, AI medallions, the different programs and certificates, and then the AI squared center. But the big piece that I think we have gone through pain and suffering and torture to get is AI courses. And what we mean is you have gen ed courses at any university where you do a biology or a math or a science. Here at UF, we're designing an AI tagging system for our courses, so that for every class, if a faculty member is interested, they don't have to, but if they're interested in getting a course AI certified, saying that their course contains significant AI literacy content, then they can go in and get it tagged. So we found some great literature, and Kati and I compiled that for the curriculum group.
(25:51):
And basically we designed, based on all of the known literature, we're not pulling anything out by ourselves, four different AI literacy types that here at the University of Florida we were going to tag courses on, obviously by the person who is submitting a course for the curriculum review, just like they do for every class now. So you design a new course, it goes to the curriculum committee. If it's a gen ed course, it also goes to the gen ed committee here at UF. And now, if it's an AI course, it also goes to the AI curriculum committee, and then it's tagged. And it's really important that it be tagged as AI, number one, so that when you are going through your accounting system in terms of what you've done while at UF, those AI classes, I mean, nine credit hours of AI courses, count towards the medallion that you're trying to earn.
(26:43):
They can just look on your transcript and they're tagged. More importantly, from day one of this process, we worked here at UF with our Career Connection Center, which is the C3 center; apparently we really like numbers and all the rest of it. So we work with Ja'net Glover, who's the director of that career center. She was on the task force, and from the very first task force meeting, I would say we bonded, but also she was straight off the bat like, this is amazing, I love that UF is doing this. But what I'm concerned about on the career connection side is that we are able to identify the skill sets that students are getting at UF, so we can translate that for industry and employers on the other side. Otherwise it's useless. So Ja'net has worked with us, and her entire group has worked with us the entire way. And when we designed the AI squared center, I mentioned it had a pretty high budget, and that was because they got to hire people.
(27:40):
One of the people they hired is an AI career expert who is housed between the center and the career center, so that from the very beginning they're paired. Because otherwise, outside of a very narrow field where my students might get employed, I can't translate that larger area. So this was a way to link to the literacy and the pedagogy by tagging courses based on really, really strong theory, and then linking it to actual skills that we could then translate, and having experts in AI help do that translation process at the career center on campus. So the literacies that we designed were basically know and understand AI, which is, what are the basic functions of AI and how would you use AI applications? So very, very basic. And for all of our tagging, we basically say to our faculty, you have to have at least 50% of your course content covering these to get tagged as an AI class.
(28:41):
Then we have use and apply AI, which is simply applying AI knowledge, concepts, and different applications in different scenarios. So you'll notice that so far they're not coding; they're not engineers, they're not computer science people. The third one is evaluate and create AI. So for those people, we're looking at higher-order thinking skills. They're predicting, they're designing with AI. So those are kind of the ones that I think everyone thinks of as AI, but here it's just one group. And then, importantly, our fourth category is AI ethics. So in our categories of AI literacy, a quarter of our program is in ethics, here looking at human-based, human-centered considerations: fairness, bias, transparency, safety, all of those bigger issues. So now people coming in interested in AI and wanting to get that credit on their class, maybe to help recruit into their course, they can put it in. This has been a torturous process because we wanted it on the transcript. We wanted it done through the official UF system. So I'm sure you can imagine the months of work on that one.
Derek Bruff (29:53):
At Vanderbilt, on one level, such a task would've been easy. Our registrar can add tags to courses at the drop of a hat. But what does that do functionally? All it means is, if students know about the tag, they can use it to search for courses, but it doesn't show up anywhere else. So it's everywhere else that's the big lift.
Jane Southworth (30:14):
The real thing we wanted to do was to institutionalize AI at UF, and we thought the curriculum committee, I mean, if I'd known how much pain, suffering, and how many meetings I'd be involved with in getting this, I mean, I've talked to Katie, and I had meetings with pretty much every level of UF all the way up about just the logistics of doing it. But we really felt, and I think Joe Glover at the top level really, really wanted this to be a process that is, okay, there's a QEP, but this is AI across the curriculum. Yes, the QEP is a little portion, and this was a way we thought of doing it, and we're starting it. So our group has been meeting, I'm also on that curriculum committee, on every committee, and Katie is as well. And so we're working on building that through.
(30:58):
So currently we're going through courses that are part of all these certificates, which are kind of the easy ones, while we work out the system. And then it'll go live at the end of fall for everyone, I mean, it's already up and live, but everyone will be notified, and it's going to be a bit of a horror show initially with hundreds of courses trying to get through, which is great. That's what we want, but it's going to be a lot of work. So we're just going through that. And then we also plan to do the same thing at the graduate level. Right now, our undergrad and grad curriculum committees have slightly different processes, but once we have it operational for one, we'll just transfer it to the other. But it's really important for the undergrads.
(31:37):
I think grad students can articulate more what they're learning and know what they're going into. Undergrads might not always know how to translate those skills into workforce and real world skills. And that's why we have, and our career center is amazing, Ja'net Glover is amazing as a director. And so for them to realize right up front, number one, yes, we want to be part of this process and go to all these random task force meetings with you, but number two, we want to invest in this. We want a person that's going to be responsible. We want to really build this hand in hand with you. I think that has made a significant difference, because from the beginning of the program, we've been thinking about the end point, which is students graduating and getting a job. And I think that helped us see it as a bigger picture. Why are we doing it? We're doing it so our students can be trained. And there's so much literature; the World Economic Forum, I think in 2022, put out a whole piece saying the biggest gap we have in the workforce in the US is in AI.
(32:33):
We do not have a trained workforce yet, and we need one, and we need it by '25, which of course isn't doable, really. Not on a large scale, because as we know, we're still working on the curriculum committee and getting these things through. Universities take a while, and we've been quick. I mean, we have done this quickly. Even though it started pre-COVID, it has been a pretty rapid transition. And so I think really when we say AI across the curriculum, which again just started as far as students are concerned in fall '24, this is brand new for them. I'm sure a lot of them aren't fully aware of it yet. And a lot of them will become aware as they're in a class and someone says, hey, you should take this other course and you're halfway to a certificate. What certificate? Why would I want that certificate? So I think we're at the beginning of the process of making the larger campus aware. For a lot of them, it's still like, AI, oh, whatever that is. Not realizing, hey, if you are on Facebook, because we're that age, if you're on Facebook and you were just talking about how you need a new red jacket, and you are amazed that suddenly these red jacket advertisements are appearing on your Facebook feed, well, yeah, welcome to AI.
(33:36):
When you use Waze for your traffic search, you use an AI, right? It's ubiquitous in society. And I think the scary thing actually is that people don't know that, right? People aren't aware that it's already everywhere. This is nothing new. This is something that industry has been using for years. ChatGPT has made everybody aware of one teeny tiny piece of AI, but it's part of a much larger ecosystem. And I think it's really exciting, and I think for some people it's really scary. But I think as with everything like that, it's all about educating people about the possibilities of it, the strengths of it, but also the weaknesses and limitations too. So I think we're really proud of the fact that AI ethics is a cornerstone of all of our courses and all of our certificate programs.
Derek Bruff (34:19):
Yeah. Well, I want to follow up on your mention of ChatGPT, because I do think that a lot of higher ed was thinking about AI very differently before November of 2022. With the advent of these readily accessible large language model chatbots and similar tools, has that changed anything about the AI Across the Curriculum initiative?
Jane Southworth (34:44):
No. I mean, in terms of the initiative, no, because it's about the larger AI world. I think people are more aware of AI, but in a way they're not. They're aware of ChatGPT, which as you say is one little version of it. I think it has made a lot of people, well, there's two groups. There's the people that are like, oh my God, this is amazing. I can write an essay in two seconds. And then there's the others that are like, oh, this is awful. They can write an essay in two seconds, and now I have to be able to find it and catch it and flag it. So I think from day one, my view is, and this is just my view, this is not the curriculum committee's or UF's view, but my view is you can't put it back in the box. These large language models, ChatGPT, all of these, they're all here to stay. So as with everything else, we should be teaching students how to use them well.
(35:35):
And so I think using it in a course where you can say, I mean, I do it, I tell my students, hey, go in and use this and type in the question and let's see what comes out, and tell it to add citations, and it'll do a wonderful job, and those citations are going to look real. And they're all complete fiction. It makes up citations. It literally creates things by people who look like they should have written that paper. And it'll invent facts. And the more you ask it, the more it'll invent. It'll just create things for you, because it's a wonderful tool. And then instead of having them be unaware, have them go in and actually evaluate and critique and improve upon what ChatGPT did, and teach them how to engage it in an appropriate way and to acknowledge they've used it.
(36:24):
Right. I think that's really, really critical, that if asked, did you use a tool in any of these, you say yes, and this is how I used it, and these are the prompts I used. And I think that's a much better way to engage it. I know in some disciplines they're really concerned about its use, and in that case, you can always do a pen and paper exam in a classroom, right? I mean, it's not that you can't do that option. I'll probably be hated for saying it, but I think a lot of us probably taught things in the same old way we always had. And I would argue that the introduction of a tool like ChatGPT is a bit of a slap in the face for all of us, of like, well, if it can be used to answer everything you've given students, are you really getting them to think critically and engage? Probably not. So maybe it's about time you changed what you did and how you're doing it, and really thought about the assessments you're doing and how you are integrating them, and taught students how to use those tools, which are really good tools, in an appropriate way. And I think that is our job as educators, no matter what the tool is.
Derek Bruff (37:33):
That's great. That's great. Well, Jane, thank you so much for taking this time today and walking us through this really, really amazing program. This has been delightful. Thank you so much.
Jane Southworth (37:43):
Yeah, thank you for the invitation. And hopefully five years from now, all universities have similar types of programs, right? And it's passé. But it's been fun.
Derek Bruff (37:58):
That was Jane Southworth, professor and chair of geography at the University of Florida and one of the architects of UF's AI Across the Curriculum program. Thanks to Jane for taking the time to talk with me and for walking us through the ins and outs of this major initiative. I keep thinking about the hard part of what Jane and her colleagues did: setting up a system to tag AI literacy courses in a way that can actually have some impact. I remember efforts at Vanderbilt to encourage faculty to embed more digital media literacy into their courses by having students, say, produce podcasts or make videos or design infographics. Those efforts could never really go beyond a set of early adopters because there was no curricular incentive to design courses like that. Such courses didn't count for students' general education requirements because there was no digital media literacy requirement.
(38:45):
Although the UF program doesn't make AI literacy a requirement, by weaving that tag into the course catalog and setting up career advising structures to align with those literacies, I think they're well on their way to seeing big changes in the curriculum. If you're interested in exploring some of what the University of Florida has done, Jane pointed me to two articles that are worth reading. One is that paper that Flower Darby shared with me, the one that I mentioned at the top of the show, that outlines the motivation and design of the new UF program. Another is an administrator's guide called "Building an AI University" by Joe Glover. See the show notes for links to both of these documents. Jane also said that UF regularly hosts teams of faculty and administrators from other universities for two-day campus visits to tour their programs and resources, for any institutions interested in doing similar work.
(39:36):
Intentional Teaching is sponsored by UPCEA, the online and professional education association. In the show notes, you'll find a link to the UPCEA website, where you can find out about their research, networking opportunities, and professional development offerings. This episode of Intentional Teaching was produced and edited by me, Derek Bruff. See the show notes for links to my website, the Intentional Teaching newsletter, and my Patreon, where you can help support the show for just a few bucks a month. If you found this or any episode of Intentional Teaching useful, would you consider sharing it with a colleague? That would mean a lot. As always, thanks for listening.