Intentional Teaching, a show about teaching in higher education

Surviving Peak Higher Ed with Bryan Alexander

Derek Bruff Episode 89




The total number of students enrolled in US higher education institutions grew steadily in the 80s, 90s, and early 2000s. However, that total peaked in 2011 at around 18 million students. It's been declining ever since. You can imagine some of what that means: fewer students means less tuition, which means fewer faculty and staff and the closure of colleges and universities. US higher ed has been on the downhill across multiple measures for about 15 years now.

That decline is the focus of Bryan Alexander's new book Peak Higher Ed. If a whole book on the crash of higher ed sounds grim, well, there's some hope in the subtitle of Bryan's book: How to Survive the Looming Academic Crisis. See, Bryan Alexander is a futurist. His work helps us imagine what might come next for higher ed and what steps we can take to navigate those challenges. I'm excited to have Bryan, who is also a senior scholar at Georgetown University, on the podcast today. We talk about the methods that futurists use in their work, the shape of higher ed's current decline, the possible futures of generative AI and how higher ed might respond, and lots more.

Episode Resources

Peak Higher Ed by Bryan Alexander

Bryan Alexander’s website

The Future Trends Forum

Bryan’s other books

Bryan’s 2020 appearances on the Leading Lines podcast

Support the show

Podcast Links:

Pre-order The Norton Guide to AI-Aware Teaching by Annette Vee, Marc Watkins, and Derek Bruff.

Intentional Teaching is sponsored by UPCEA, the online and professional education association.

Subscribe to the Intentional Teaching newsletter: https://derekbruff.ck.page/subscribe

Support Intentional Teaching on Patreon: https://www.patreon.com/intentionalteaching

Find me on LinkedIn and Bluesky.

See my website for my "Agile Learning" blog and information about having me speak at your campus or conference. 

Bryan Alexander

This is all about giving people options and giving people visions and possibilities that they can choose from so that people have agency, that their choices matter. I want to make sure that everybody listening to this feels that sense of agency.

Derek Bruff

Welcome to Intentional Teaching, a podcast aimed at educators to help them develop foundational teaching skills and explore new ideas in teaching. I'm your host, Derek Bruff. The total number of students enrolled in U.S. higher education institutions grew steadily in the 80s, 90s, and early 2000s. However, that total peaked in 2011 at around 18 million students. It's been declining ever since. You can imagine some of what this means. Fewer students means less tuition, which means fewer faculty and staff, and the closure of colleges and universities. US higher ed has been on the downhill across multiple measures for about 15 years now.

Derek Bruff

That decline is the focus of Bryan Alexander’s new book Peak Higher Ed. If a whole book on the crash of higher ed sounds grim, well, there's some hope in the subtitle of Bryan's book, How to Survive the Looming Academic Crisis. See, Bryan Alexander is a futurist. His work helps us imagine what might come next for Higher Ed, and what steps we can take to navigate those challenges. You know that scene in one of the Avengers movies where Doctor Strange goes all timey wimey and sees millions of possible futures so he can identify the one future where the Avengers win, and then he tells Iron Man what to do so they have a chance of landing in that future? It's a bit like what Bryan does, but with research and scholarship instead of an Infinity Stone.

When did you realize you wanted to be a futurist?

Derek Bruff

I'm excited to have Bryan, who is a senior scholar at Georgetown University on the podcast today. We talk all about the methods that futurists use in their work, the shape of higher ed's current decline, the possible futures of generative AI and how higher ed might respond, and a whole lot more. It's a very deep and wide-ranging conversation with someone who knows this space very, very well.

Derek Bruff

Bryan, thank you so much for being on Intentional Teaching. I am very excited to talk to you today about your new book and the possible futures of higher ed. Thanks for being here.

Bryan Alexander

Oh, thank you, Derek. It's always good to be with you in any situation, and I'm really grateful to you for hosting me.

Derek Bruff

Yeah, I'm excited about our conversation. Before we get into the book, I want to ask you a version of my usual opening question. I actually realized I asked you my usual opening question six years ago when you were on my old podcast, so I'm gonna ask a different one. Can you tell us about a time when you realized you wanted to be a futurist?

What is "peak" higher education?

Bryan Alexander

I could think of two times. I could think of when I was a kid and I read Alvin and Heidi Toffler's Future Shock, or I read as much of it as I could, and it blew my little mind. I was so excited, and it made so much sense. And I thought, yeah, yeah, yeah, this is something I should do. But then as an adult, around the year 2010 or so, I was talking with many, many academics about emerging technologies for teaching and learning, and realizing that what I was talking about was the future of education, and that that framing was much more powerful and much more productive than talking about the emerging technologies for education and learning. I think that was that sense of, oh, I've got to be doing this.

Derek Bruff

So, what is a futurist, and how is your book, Peak Higher Ed, an example of futurist work?

Bryan Alexander

A futurist is someone who helps people think more creatively and strategically about the future. I mean, that's my understanding of the field and the function. And this book, Peak Higher Education, is an example of one kind of futurist thinking, which is called scenario analysis or scenario development. That's when you create a story about a potential future, you flesh it out in detail, and then use it to imagine how the world would be different if this kind of thing happened. Scenario analysis really goes back to the 1970s. The great futurist Pierre Wack did some classic scenarios for Shell Oil, and scenarios are now very, very widespread, in part because scenarios are stories, and people respond well to stories.

Bryan Alexander

So years ago, back in 2012, 2013, looking at some enrollment data, I had a kind of intuition wave in my brain, and I thought, it's possible higher education might be peaking in certain ways. Let's test this out. So I did a quick blog post about this, and then I kept digging into it, I kept following it. I tested this out with audiences, both online and in person. This appeared in my first Johns Hopkins book, Academia Next, fully fleshed out as a chapter. And it really held up for about 11 or 12 years. I was the only one who made this call, so I'm fairly proud of it. And so my publisher said, why don't you take this to full book length? Why don't we see what has happened? How did we get to peak higher ed, what happens now, and what might happen next? So the first half of the book is mostly about working this out in detail: what peak means, how we got there, how it's playing out. And the second half of the book is some different ways that higher education might respond.

Derek Bruff

And when you say peak here, you mean, having read most of the book now, more than what we might have heard of as a demographic cliff. And in fact, I realized I had a pretty fuzzy definition of demographic cliff. It's a more precise term than I thought. Could you say what a demographic cliff is and how that's part of peak?

Bryan Alexander

Well, first, how I understood peak. My initial understanding of it was in terms of total enrollment: how many students American colleges and universities are enrolling. And this is a very US-centric book, although the patterns it discovers can apply to certain other nations. So this is different from my previous book, Universities on Fire, which is truly a global look at higher ed. I was looking at how total enrollment in American higher ed started to grow in the early 1980s and just grew and grew really steadily until about 2012. And there's no name for this period. Clay Shirky, I think partly tongue-in-cheek, says we should call it the golden age. And he might be right. And what happened was that for the next decade plus, enrollment ticked down before the pandemic. The pandemic really saw drops. Now, the past year and a half or so, enrollment has started to bounce back. But what's interesting is it's entirely based in high schools: high school students taking classes at community colleges. It's called dual enrollment. There's no big boom in liberal arts enrollment, there's no huge flood of students to research universities.

Bryan Alexander

But along with that, since enrollment is crucial to the supermajority of institutions in terms of their finances, I was afraid that this might mean we would see a peak as well in total employment of faculty and staff, as well as total number of institutions. And I've been correct. I mean, the total number of institutions, which grew and grew, maxed out, and we've seen a small decline since, and we've seen all kinds of cuts to programs, staff, and faculty across the country. The reasons for this are multiple, and one of them is demographics.

Bryan Alexander

So when you mentioned the demographic cliff, there are really two senses, and there's a very precise sense to this, as you said. My friend Nathan Grawe came up with this in a wonderful book where he looked specifically at one year. He looked at 2008 and made the case that because of the financial crash of that year, lots and lots of parents decided not to have kids. So as a result, we have a little demographic hole that's been working its way through the K-12 system and is now hitting higher ed. So there's a decreased number of children; we have fewer kids than we would otherwise have had.

What are possible futures of generative AI?

Bryan Alexander

But I think broader than that is what demographers refer to as the demographic transition. And that's a much bigger, deeper transformation. I think we're all pretty familiar with it now. That means that a society that goes through modernity tends to have fewer kids, and the adults tend to live longer lives. And we've seen this across the world. We've seen this all over Europe, all over North America, and definitely China. China is now the second most populous nation in the world. And, you know, the reasons for this are pretty straightforward. You talk about a society that gets improved health care and public health, that gets richer. But especially when women get more education, more access to the labor force, and more reproductive autonomy, once you put all those things together, childbirth falls off a cliff. And that's the real cliff that we're working with now. This has impacts for everything, from a nation's self-concept to how you fund elder care, pensions, and medical services. And in higher education, to the degree that we teach traditional-age undergraduates, people aged 18 to 22 or so, it means that pipeline is narrowing. And if magically tomorrow all kinds of women across the US suddenly got pregnant, which is not gonna happen, and is actually kind of horrific to think about, that still wouldn't impact higher ed for another 19 years. So we've got two decades, and most likely quite a bit more, to think about for this.

Derek Bruff

Yeah. Yeah. Well, I want to focus on a couple of chapters in your book. One is the chapter on AI, which is a topic we talk a lot about on this podcast, and I'll start there. That chapter actually uses a slightly different futurist technique I was not familiar with. Is it James Dator? Is that how you say his name? He has a four-fold model for categorizing futures. Can you say a little bit about what that model is and why you chose it for the chapter to help us imagine the future of AI and higher ed?

Bryan Alexander

Sure. Jim Dator is just one of the great sages in the futurist field. He's at the University of Hawaii. Back in the 1970s, he first came up with this. He basically sat down with hundreds and hundreds of futurist books and came up with what he initially called archetypes. And the four archetypes are pretty clear. One of them is growth. So you're taking a look at a vision of the future, and it might be a total vision, you know, here's humanity, or it might be the vision of a nation or region or religion or business. But one of those patterns is growth: it will get bigger and better. I mean, that's just clear. A second is collapse, which is pretty self-explanatory; it's the opposite. A third is what he called discipline, and that's when a society really becomes more intensely organized. One of the examples that he likes to talk about is if we were to have a kind of ethical transformation and become anti-consumerist. I like to think about it as a society that goes through a revolution, like say the Russian Revolution or the Chinese Revolution, or that goes through a very strong wave of religious control or theocracy. That's discipline. And the fourth is transformation, where things just get strange.

Bryan Alexander

So I came to these because these are a classic futures model, but also because when I look at AI, I find it to be so fluid and unstable. I mean, in the education space, we often talk about its unpredictable nature: we're not sure how it impacts teaching and learning, we're not sure how to structure it in terms of anything from curriculum to policy to support. But also, I think beyond education, in the wider context where we shape our approach to AI, thinking about AI as a technology, meaning large language models, there's a lot of argument about whether this is going to be the seed for artificial general intelligence, or a dead end in computing that needs to be superseded. There are questions of economics: is the LLM build-out a bubble about to burst, or is it infrastructure that we're gonna live with for generations to come? The impact on the labor market, we don't know; it could cut in three very, very different ways. And the impact on geopolitics and society and individual psychology. So I wanted to sum up all of that instability, all of that chaos, and I thought Jim's four templates were a really good way of doing that.

Derek Bruff

And so the book is a little hard to get one's head around, I think, in the sense that you're not really predicting a future. It's this scenario piece where you're saying: if the story continues in this direction, if in fact 2010, 2012 was the peak of higher ed along these different metrics, what could happen next? And in this chapter, how does AI intersect with that? Am I reading that right?

Might higher ed be moving toward a "discipline" future on AI?

Bryan Alexander

Yeah. You are completely correct. So one possibility is that the way AI develops, and the way that we acculturate and use AI, the way we think about AI, might worsen the peak situation. It might drive us further downslope. That is, for example, we might see that a lot of people, maybe a plurality, think that AI is accessible, friendly, cheap, and good enough for what people want to do. And they might, in turn, consider higher education to be the opposite: expensive, unfriendly, perhaps out of touch, perhaps awkward to use. And as a result, we might see not just enrollment continue to drop, but we also might see public attitudes get even worse, funding get even worse, and so on. Now, the opposite is possible as well: we might determine that the hallucination problem for AI is too great and unhandled, whereas humans and academics are more trustworthy. AI might seem kind of morally dubious, politically uncomfortable, whereas academia will seem less bad in contrast. So if that kind of scenario happens, then we might bounce back a bit with the help of AI. These are, you know, two possibilities.

Derek Bruff

Yeah. And I think that first one you mentioned is kind of the growth scenario, where AI strengthens over time in some fashion. And the other is a collapse scenario, where AI kind of loses credibility. And I think those map on pretty well, to some degree, with what I hear in the discourse in higher ed, which is kind of a pro-AI stance or an anti-AI stance.

Derek Bruff

And I know things are more complicated than that, but I wanted to get your take on something, because I feel like when I'm working with faculty and they take some time to think about their learning objectives, their students, their course context, they understand AI and its affordances and its limitations, and they start to make informed, intentional choices about the role of AI in their teaching. I find that a lot of faculty land on some mix of resisting AI and engaging AI. And I wonder if that's a version of the discipline future, where we kind of land on a fairly stable set of norms and responses to this new technology. Would you agree that that's an example of discipline? And does it seem like we're heading there now?

Bryan Alexander

I would agree that that's an example of discipline and that it's possible. However, I find it unlikely, and for a few reasons. One is that the divides over AI in the academy are so deep and so dug in right now that reconciling them sounds really, really difficult. That's one problem. A second problem is that colleges and universities have generally not made a strategic posture or turn towards AI. This may sound kind of counterintuitive, but last year's CHLOE survey found that only about 23% of colleges and universities actually had an AI policy. And "policy" was construed pretty broadly; it included taking your pre-existing academic integrity policy and sticking "don't cheat with AI" into it.

Derek Bruff

Yeah.

Bryan Alexander

What's happening is that the majority of colleges and universities, at the top level, are basically punting on the question of AI. And they're deferring that down to individual units and departments. So you look at, say, career services, and they're trying to figure out how to prepare students for a world of work saturated by AI. You take a look at IT departments, which are trying really hard to figure out how to license this stuff, which is really challenging, how to deal with security issues, how to deal with support issues. But especially with the faculty, really it's down to individual units like colleges and departments, and they haven't gotten any extra resources to do this, so they've deferred further down the line to individual faculty members. And as should be well known, the largest population of faculty in the United States are adjuncts, so it means that adjuncts are really leading the way in how American colleges and universities respond. And adjuncts are of course not only the worst paid, but they also have very little support.

Derek Bruff

Yeah.

Bryan Alexander

I mean, this is a heck of a way to build a railroad. But I think it is possible that we could see units within different colleges and universities pick up different forms of discipline, in either sense. For example, a writing department, or everyone involved in writing across the curriculum, might take a very strong no-AI stance. You could see other universities, like Ohio State University (excuse me, the Ohio State University), which are committed to something they call AI fluency. I'm very interested in how AI literacy gets promulgated on different campuses. That's a kind of baseline awareness of AI.

Bryan Alexander

But I would go back to Dator's fourth template here, which I think is also likely, which is the idea of weirdness and transformation. It seems right now that nobody on earth has a handle on AI. I mean, businesses are still trying to figure out how it connects with work and where savings can be had. Governments are trying this. And the only success story seems to be that the Iranian diplomatic corps has mastered using Lego-themed video parodies of Trump. Otherwise, people are still trying to figure out how to use this for planning, for propaganda, and so on. In higher ed, there's no consensus. There aren't any best practices. It's wide open.

Bryan Alexander

So, in that sense, it's possible higher ed appears as the world's guide to the weirdness of AI. That happens internally with students who are trying to figure out, well, what kind of career do I have? What does meaning look like in an AI world? And we help through our research and our teaching. It may be as well that we help with our public education, that people publish books, like you and your colleagues, for example. But also that we do other things: having a podcast like this, making a TikTok, getting on NPR or whatever. Academia has this tremendous depth of intellectual horsepower that we can bring to bear to help the world figure out what the post-AI world looks like. Nobody else on earth, I think, has that level of intellectual capacity. I mean, Hollywood can give you some images; some are stupid, some are useful, but not many. Governments are clearly way behind the eight ball on this. Academia can really play a role. So that would be the fourth scenario.

Derek Bruff

Okay. And what I'm also hearing is this notion that we're leaving it up to the adjuncts to figure this out, right? Just because of the way that universities and colleges are structured, the way that decisions are not being made at administrative or organizational levels. You know, I know of individual faculty who are starting to land on a disciplined approach to AI. They have a sense of what it means for their teaching. But I've also observed that there are very few departments, colleges, or schools that are making coordinated efforts to figure that out. So I think that's well said. That means the discipline future is harder to get to if we don't have it operating at all levels of the industry, and not just the individual level. And as you write in the book, there's the kind of trajectory that we're on of decline. Does this provide an opportunity? I mean, one could see further decline in response to AI, or one could kind of ride these waves and perhaps reverse that trend, at least at a particular institution.

What are hypermodernism and demodernism?

Bryan Alexander

That's what I'm hoping. I mean, this is the part of the book where I bring up what I call three huge ideas, and how those big ideas or movements might further shape peak higher education and how it responds. One is AI, one is climate change, and another is this idea I've called the battle for the future, or two divergent ideas about the future. And I brought these in in part because, in my work, I'm very keen on bringing global context into higher education. The higher education universe is very, very rich, very, very intense. And it's very easy for us to just focus on what's going on in our campuses, with our publishers, on our grounds. And there are a lot of good reasons for that. But we are also prey to the world; we are in a context where global challenges, global changes, and forces impact and shape us. I mean, that's one reason. But the other is that we can respond, again through the methods I mentioned before: through our research, through our scholarship, through town-gown relations, through public scholarship. And higher education has always been Janus-faced in this. On the one hand, we're very internal, very inward-focused, but at the same time, we do have a public mission. So I'm kind of unabashedly in favor of that second approach.

Derek Bruff

Yeah, yeah. Well, I think we have a knack for handling complexity well, and these are very complex challenges. Let's turn to that battle that you mentioned, the two ideologies that you've called hypermodernism and demodernism. What do you mean by those terms, and why did this make the book? Why is this a useful framework for thinking about the future of higher ed?

Bryan Alexander

Well, partly for personal reasons. I can't stop thinking about this. This topic has been burning me up, and I've got a book proposal I'm starting to shop around, because I think about this literally every day. And so, to explain for listeners who haven't had a chance to read the book yet, the idea is that there are two very, very opposed ideas for how humanity should proceed. And they are along the lines of what progress means. One of them, which I've nicknamed the hypermodernist idea, holds that we need to keep making progress, that everything we've been doing for the past hundred, two hundred years of industrial revolutions, we need to just keep doing. So we need to keep expanding: expanding the human lifespan, growing the body of knowledge, reaching out into space, becoming richer, coming up with new technologies, making the whole human experience bigger, bolder, better in all directions. I mean, one endpoint of that in the imagination would be the Star Trek world, right? Post-scarcity and so on.

Bryan Alexander

And then opposed to this is an idea that I've nicknamed demodernist. And this is an idea that says, okay, we need to pause, because all this progress has done some great stuff, I mean, anesthetics are good and all of this, but we've done terrible damage. We've broken the Earth system through climate change. We're dealing with overshooting planetary boundaries. The past 200 years also saw planetary colonialism. We're still dealing with all kinds of authoritarian politics now. We need to just pause all of this. We're at a great moment; let's just pause and fix all these problems first. My friend Bill McKibben has a great little book on this called Falter, where he says this is the ethical thing to do. But there are other demodernists who say this isn't just the ethical choice; this is something we might not have a choice on. That is, all these problems I just described are so deep and persistent that we're gonna have to grapple with them. And this is going to mean some kind of shrinkage. It might be what European economists refer to as a no-growth or degrowth economy, where the economy shrinks. There's a Swiss project that I'm really interested in where people try to figure out how to limit their electricity consumption each day. This ties in in part to demographics, where you see population trends coming to the point where the total human population begins to shrink. And this can get very dark. Some say that this is going to be a catastrophe, a major crash. There's a very challenging and disturbing book called Hospicing Modernity, which argues that modernity is done, that it's past its sell-by date, and what we have to do now is just kind of minimize the catastrophe and prepare the next generation for what comes next.

Bryan Alexander

So I think these demodernists and hypermodernists are massively opposed to each other. And to be clear, these are incipient ideas, ideas that I'm tracing through multiple parts of the world, from pop culture to political economy to the technology world to journalism. No one's out there waving a flag saying "we're the hypermodernists" yet. And these two ways of thinking may mutate over the next generation, depending on how they progress. But I think these are vital when it comes to higher ed for a few reasons. One is they describe futures for higher ed.

Bryan Alexander

So does higher ed continue to grow? The journalist Malcolm Gladwell had a university president on his podcast once, and he asked the president, I'm gonna paraphrase here, when would you say no to a charitable gift? When would you have enough money? And the president said never; we always need more money. And it was a very, very wealthy school, I think it was Stanford. And there's that model, that a given university or college should simply keep growing: that we should grow our student body, that if you have an endowment, you should grow your endowment, grow your budget, keep building knowledge, grow your curriculum, possibly grow your physical footprint, maybe your virtual footprint as well. And for the past 50 years, that's been our kind of default model.

What might happen if institutions stopped trying to be all things to all students?

Bryan Alexander

Maybe, though, the demodernist idea is that if you're in a state in, say, the Northeast or the Midwest, where your population is starting to shrink, you think maybe your college footprint should shrink as well. Maybe you should go what one president friend of mine calls "asset light" and reduce your physical bricks-and-mortar footprint, and get used to having 8,000 students instead of 10,000, and figure out how to do that in a humane way that preserves your college mission. But also, these two ideologies are ones that higher education works on. When I mentioned degrowth or no-growth economics, these are economic ideas which say that we need to pause GDP growth, or even reduce it. And these come from, among other places, academic economists and people in poli sci. Or when we think about analyzing, say, the climate crisis, this is across the board; we have people everywhere from environmental studies to sociology to religious studies taking a look at this. And we get to teach this. Sometimes we've been doing this perhaps unconsciously. You think about business schools, which are, you know, all about growth: how do I grow your business, how do I keep growing shareholder value? We've been teaching that into the world. And then people who argue for the opposite, we also teach that, when we teach voluntary simplicity or when we teach economic or environmental repair. So I think this is a deep and fascinating divide that impacts higher ed and that higher education could really develop.

Derek Bruff

I want to ask about the demodernist side first. There's a conjecture I keep hearing from journalist and author Jeff Selingo. I'm sure others make this argument, but he seems to mention it on the regular. It's the notion that there are too many colleges and universities trying to be all things to all people, or too many universities trying to emulate the research ones rather than leaning into their own identity, their own strengths. He's arguing that institutions shouldn't try to do all of that, that they should maybe even downsize a little, find their niche and occupy it really well. And I'm wondering, do you see that as a kind of managed descent in the demodernist tradition?

Bryan Alexander

It can be. And just to be clear, the idea that a university or college should find its own niche has been a staple of consulting and strategy for about 20 years now. The usual positive idea, or the hypermodernist idea, if you will, is that a university that can do that will succeed, that it will bring in more dollars and more students, and that'll be a recipe for growth. The demodernist take would be that's your only way forward, that every state school shouldn't try to be Harvard, every liberal arts college shouldn't try to have a full university curriculum, that you should pick your niche. And the niche could be very, very small. There was a school on the West Coast in Oregon's wine country which started a viticulture studies program, which is doing really well because they've got great wine culture around the area, and they bring in people who teach in business and people who teach in earth science and so on. Or it could be something larger. You have schools with music conservatories, like, say, Berklee College of Music or Eastman. You could think about schools like Middlebury College, which has a great language school. It might be that a campus has a reputation for focusing on blended learning. All kinds of possibilities. Some of them might be hyperlocal to their region, some of them might be national. But that is one approach. The difficulty, though, is that a lot of universities suffer from Harvard envy. A lot of our faculty and a lot of our staff come from Research One universities, and so they bring a lot of those habits and expectations. And also we love growing stuff, we love adding new programs. We don't like shutting them down or merging them. We might have to learn how to do that.

Derek Bruff

Okay, so that's interesting. This specialization argument: there's a growth argument for it, support from the hypermodernist approach, but there's also a way to think about it as a demodernist approach.

Bryan Alexander

You see what I mean. Yeah, this is hard to unsee once you hear it.

Derek Bruff

Yeah. Through this lens, I see so many universities that are torn between these two ideologies, right? On the same campus, you'll have pockets of grow, grow, grow, and you'll have pockets of we've grown too much, we need to fix problems. And I think in some ways AI highlights that. The debates around AI shine a spotlight on that.

How might AI affect the career readiness aspect of higher ed's teaching mission?

Bryan Alexander

I was going to say, one argument for AI is that we can use it to help solve the climate crisis, that AI will help us handle incredibly vast oceans of complex data, and that it will help give us policies and ideas and projects. When you take a campus and cover it with sensors so that people can see all the different varieties of temperature, of moisture, of light, AI can help with all that. That's one argument. The other argument, of course, which is better known, is that AI has an enormous electricity footprint, an enormous water footprint, and also soaks up all kinds of materials, especially rare earths, and so we need to stop all that growth. We're starting to see this two-cultures divide, if you will, start to play out in terms of how we think about AI. I live in Northern Virginia, which is just data center alley. There's a gym that my wife and I go to every morning, about three miles away, and we now pass about six data centers on the way. And the locals here don't like it anymore. There's a lot of upset about that, even though the data centers are visible symbols of growth. So it's a fascinating time, and I think we need to look further ahead to see where this could go.

Derek Bruff

I keep wanting you to tell me what the future holds, but that's not the goal here, right? The goal is to equip us with tools to understand and see trade-offs and make connections. I'll ask you one last question, because this comes up in my day job a lot. I know a lot of liberal arts faculty who look at the ways that AI is affecting professional work broadly. Some faculty see that and say, we need to prepare our students to enter that workforce, and that means we have to grapple with AI in certain ways. But I also know a lot of faculty who say, that's not the goal of a liberal education. We're not here to prepare you for a workforce. We're here to help you learn how to think critically, to read deeply, to write, to be engaged in civic life. How would you apply some of these frameworks to that tension that a lot of faculty are experiencing, between wanting to help students but not wanting to turn their jobs into workforce development?

Bryan Alexander

A lot of academics will see AI as undeniable: it is there, it is growing, we think it's shaping the world, and we need to figure out how to work with it. The opponents often come from the humanities. People in writing and composition, for example, are fiercely opposed to this. Anecdotally, I see a lot of classical studies scholars who are very anti-AI. And they often make a very political argument. Their criticisms of AI are partly tactical or immediate, based on what they think this is doing to the physical space of their classroom and to student learning. But they'll also bring in, I mentioned climate change before, they'll also bring in bias, and they'll bring in labor issues, both the impact on paid labor and the misuse of labor in training AI. These are very political, very progressive critiques. And one thing I'm anticipating is that we'll see AI on campus become much more political. Because outside of the academy, the Republican Party is largely, not entirely, but largely very pro-AI. You got that from Project 2025. The current president is very pro-AI. The administration is pro-AI, although that plays out in different ways. And the opposition is, like I said, very progressive. The Democratic Party has yet to turn on AI, for various reasons we could talk about, but it's possible that we'll see that split play out on campus. And it may play out elsewhere: you could have someone protest a data center by accusing it of being a right-wing plot. So I think that's one thing we're seeing on campus, that impending political reading of this.

Bryan Alexander

But we also have other issues. One that fascinates me is that it's very difficult to structure AI on campus, and by that I mean to figure out who's responsible for it and how you handle it. Some campuses have been opening and filling AI czar jobs. That's a nickname, but you know, a dean of AI, an officer of AI. A friend of mine is an AI integration officer at a community college, and that's the function they have. Or you may pick a committee, so rather than a single person, it's the committee for AI. Or you may have a set of committees that work together; I've seen one campus where there are 10 different ones charged with this. It may be that the chief academic officer, the academic dean or provost, takes the lead. Or there may be a center. The most extreme example of this is at Bowdoin, where a Netflix co-founder gave them a lot of money to set up an AI in the humanities center. No one's figured this out. It's interesting to see which of these will play out. Of course, it varies from campus to campus based on their culture, their finances, their approach.

Bryan Alexander

But also, I think a lot of this is this sense of, and this goes back centuries in American higher ed, what is the purpose of higher ed? Is it a kind of private, business-oriented good? I've found that for the past 30 years, that view has really taken off. One reason is the financing of higher ed, where we've added this enormous amount of debt and published prices have soared. Now, what's actually paid is a different matter, but this last year we had two campuses crack the $100,000-per-year barrier for the first time, and we should expect a whole suite of them to follow suit this summer. And so students who see that, and who come from a world which is very market-oriented, for them, seeing college as a transaction which will leave them a better job simply makes sense. And I find some faculty and staff are disconnected from that reality and don't quite see it.

Bryan Alexander

On the other hand, college is an inward-looking space, apart from the world to some key degree, where we can kind of suspend a lot of the operations of the market, where we could try to have spaces where young Derek Bruff can explore wherever his curiosity takes him. Does it take him to Russian literature? Does it take him to photons? We have that moment and we can focus on that: developing students for who they are. There are other ways to take this, too. You can have a non-economic reason for being fascinated by AI, for example.

Bryan Alexander

We could also think about the public goods that we have some kind of consensus on: that higher education is about, among other things, making people physically and mentally healthier, so they will be better off in society, they'll be happier, but also less of a burden on our public health system. And also that they'll be more civically oriented, better equipped to vote, to march, to contribute, to run for office, which is better all around. It's an open question whether we're still committed to that right now. And we're opposed in this to some extent. I mean, the Trump administration is pretty clear they want fewer students to vote this fall, and so they've taken some steps against Tufts University and the National Student Clearinghouse to reduce that.

How can the work of futurists build agency?

Bryan Alexander

I think in many ways, the way these debates over AI play out, it happens very explicitly at the level of policies, individual campuses, individual classes, but it also strikes us in very deep, subterranean ways. To pick one example, there's the idea of AI as a companion bot that we can spend time with. Not just give me a recipe or help me with my essay, but tell me a joke, or I'm feeling lonely and I don't know what to do about it, or I'm worried about my partner, what should I do? Brent Anders, the great AI literacy guru at the American University of Armenia, once told us that he had some of his students reporting spending up to six hours a day with chatbots. How do we think about this? What does it mean to have these aliens in our midst? Dario Amodei, the co-founder of Anthropic, said that we should think about AI now, I don't know if you saw this, as the arrival of a country of 30 million geniuses that just appears next to your country. What do you do with that? Is that a threat? Is that someone you make treaties with? Is that someone you try to avoid? The AI revolution really hits us in so many registers. I'm looking forward to your new book in part to help us tackle a lot of those registers on campus.

Derek Bruff

Yeah. Well, Bryan, thank you for coming on the show. I think I heard you say "on the other hand" about a half dozen times, because what you're doing is helping us understand the complexity. It's easy for us in academia to stand where we are and see everything from our own positionality and values. I like how you take us up a level and help us understand the systems and the possibilities that are there, and the weirdness that is happening. You're not shy about tackling the weirdness, and I appreciate you doing that. Thanks so much for being here and for sharing. This has been a great conversation.

Bryan Alexander

Can I leave one last note about that?

Derek Bruff

I would love that. Yeah.

Outro

Bryan Alexander

Well, there's a long tradition of futurists who describe themselves as predictors: this will happen next. Yuval Harari does this, for example. H.G. Wells did it sometimes. AT&T used to have a commercial series about what will happen in the future. I'm part of the other school of futurists, who think that this is all about giving people options, visions, and possibilities that they can choose from. So that people have agency, so that their choices matter. When you look ahead to climate change, to AI, to progress, to these questions of enrollment and demographics, you, whoever you are, a student, a staff member, a faculty member, a president, a board member, a state legislator, a business operator, have options you can choose from and visions of where this might go. I want to make sure that everybody listening to this feels that sense of agency. Agency is uneven, it's conflicted, but we all have these potentials for making choices so we can build this future together.

Derek Bruff

It's a great place to leave it. Thank you so much, Bryan.

Bryan Alexander

My pleasure, Derek. Always a pleasure. 

Derek Bruff

Thank you to Bryan Alexander, senior scholar at Georgetown University and author of the new book Peak Higher Education: How to Survive the Looming Academic Crisis from Johns Hopkins University Press. I always come away from my conversations with Bryan with a deeper understanding of how higher ed works and how it could work in the future, and I hope you gain similar insights from our conversation today.

Derek Bruff

For more from Bryan, visit his website, bryanalexander.org. Check out the Future Trends Forum, a live video conversation space that Bryan has been hosting every Thursday for years, and read his books. Peak Higher Ed is his latest, but he has also written Universities on Fire, about higher ed and the climate crisis; Academia Next, exploring other possible futures of higher ed; and The New Digital Storytelling, his 2017 book on educational applications of new media and narrative. See the show notes for links to all of this.

Derek Bruff

Intentional Teaching is sponsored by UPCEA, the Online and Professional Education Association. In the show notes, you'll find a link to the UPCEA website where you can find out about their research, networking opportunities, and professional development offerings. This episode of Intentional Teaching was produced and edited by me, Derek Bruff. See the show notes for links to my website and socials, and to the Intentional Teaching newsletter, which goes out most weeks on Thursday or Friday. If you found this or any episode of Intentional Teaching useful, would you consider sharing it with a colleague? That would mean a lot. As always, thanks for listening.
