AI Education at Dartmouth College

ABOUT THIS EPISODE

Professor Dan Rockmore, Director of the Neukom Institute for Computational Science at Dartmouth College, joins the podcast to discuss the AI education maturity curve, the role of the academy vs. industry when it comes to artificial intelligence, and the economic reality that industry is willing to pay our AI faculty far more than we can.

You're listening to Enrollment Growth University from Helix Education, the best professional development podcast for higher education leaders looking to grow enrollment at their college or university. Whether you're looking for fresh enrollment growth techniques and strategies or tools and resources, you've come to the right place. Let's get into the show.

Welcome back to Enrollment Growth University, a proud member of the ConnectEDU podcast network. I'm Eric Olsen with Helix Education, and we're here today with Professor Dan Rockmore, director of the Neukom Institute for Computational Science at Dartmouth College. Dan, welcome to the show.

Thanks, Eric. I'm looking forward to the conversation.

Likewise. I'm so excited to talk with you today about the AI education maturity curve in higher ed. But before we dig into that, can you give the listeners a little background on both Dartmouth College and your role there?

Sure. Dartmouth College is an Ivy League institution committed to the liberal arts at, we like to think, the highest levels. We have graduate programs pretty much only in the sciences. We're the smallest of the Ivies in undergraduate enrollment. I'm a professor of mathematics and computer science; I work in machine learning and computational kinds of mathematics, and, as you mentioned, I direct the Neukom Institute for Computational Science, whose mission is to spread the reach of computational ideas throughout campus. So that's me.

Beautiful mission, and you are a perfect person for this conversation, not only because of the wonderful work that you and your team are doing now, but because of the legacy work that Dartmouth has always done in this space. So to kick us off: the Dartmouth Summer Research Project on Artificial Intelligence was, some would say, the origin story of the very beginnings of AI research.
How does Dartmouth think about its educational role in the field of artificial intelligence today?

So AI is spread throughout the curriculum, and a bit throughout departments as it becomes relevant to many different disciplines, including some in the humanities as well as the social sciences. So it's not just the sciences, although the concentration of our teaching of AI is in the computer science department. Like pretty much all of our departments and programs, we aim to provide a top-level education for our undergrads and, where relevant, also give graduate students access to the same sorts of ideas at the levels at which they need them. I like to think that we're graduating students who know as much about artificial intelligence and machine learning as students coming out of pretty much anywhere, maybe excepting a place like MIT or something, but maybe not, because our students also get plenty of opportunities to apply those ideas in research projects. We have a...

...very strong, very vibrant program in pairing students with faculty on publishable research, and as director of the Neukom Institute I see a lot of these things from faculty who are looking for funding for students to work on these projects. And so, short story, we're trying to provide students with a level of teaching in artificial intelligence appropriate to their previous training, so that they can take the next step to graduate school or industry or whatever it is they want to do.

Yeah. In order to teach AI well, you obviously need great AI subject matter experts, great AI faculty. Talk about higher ed's potential problem in securing AI faculty in this growing industry, when the economic value to industry of folks who have these talents is so incredibly compelling right now.

It can be a challenge to hire in this space, assuming that the only thing the person is thinking about is their salary. Although actually, sometimes the most compelling reason for some of the top players to go to industry is that industry is also the keeper of some of the most interesting data, on which people are trying out their artificial intelligence kinds of ideas and looking for applications. So even if somebody is not just, quote unquote, in it for the money, sometimes they're choosing between industry and the academy because they see more interesting research opportunities, and that's a challenge that money can't overcome, because no matter how much money we give them, a company is still going to hang onto its data, and even if you can strike up a partnership with a company, they almost surely won't let you publish your work about that product. That's their trade secret. And so it's a little bit of a Faustian bargain: you're doing all kinds of great work, and you have to be happy just to see it in an industry or product setting.
That's fine, but the ideas will probably never see the light of day. So that's the challenge. On the other hand, we have certainly succeeded in hiring great young minds to come to Dartmouth to teach artificial intelligence kinds of subjects. They tend to be collaborative, energetic young faculty who are energized by the idea of teaching and interacting with students, and by the freedoms that the academy allows, certainly after tenure, where you can pursue anything you want. So the challenge isn't insurmountable, because there are these two kinds of people. Not everybody is driven by money; there are also lifestyle considerations. So you've got to put together the whole package, which is to support them well enough to do work at the scale that...

...they need to do it. Sometimes you have to provide extra teaching kinds of accommodations, because that's part of the package too. But the ones who are honestly interested, like I said, are interested in it for more than the money. They're interested in bringing along the next generation of students. They're interested in coming to the academy for the freedom and the broader intellectual environment, at least at Dartmouth, where we have a pretty well-connected, close-knit intellectual community.

I loved how you addressed all the different motivating factors that someone might have for choosing industry versus the academy, including being on the cutting edge of where this is going. I know there's that story from a few years back about Carnegie Mellon's partnership with Uber, where Uber ended up wanting not just to create this big graduate pool of students who could work there, but ended up grabbing a bunch of the faculty and bringing them on board. And to your point, we don't begrudge folks for making economic decisions, but it was probably also, realistically, a more exciting option for them at the time to do what Uber was doing with data science and artificial intelligence. And so you, similarly, are telling a different story: you're attracting a lot of young AI minds to be a part of this mission work of creating the next generation. How do you think about this? We might have two sides of the coin here: a lot of folks drawn to industry, and this AI field and talent emerging out of industry, versus, on your end, this talent, this education, this thought coming out of the academy. What are the pros and what are the cons of that happening simultaneously in both spaces?

Well, that's a big question with a lot of parts to it. So, the best educations, and this is my perspective, I should say, not everybody would agree with me.
The best educations, I like to think, follow the kind of classical liberal arts education model, where you sample among the traditional divisions of the academy and focus on a subject in order to graduate with a major. And I think that's a good model, certainly with respect to STEM and actually anything that has huge social implications, because there are all kinds of unplanned-for, unexpected side effects when you deploy significant technologies into the world in terms of products or policies or things like that. And if you don't have a broad view of the world, of the world of ideas and history and, to keep going, culture, then you start off blind, and then you get blindsided by one of your products being used in a pretty terrible way. That may have industry implications, but generally speaking, in terms of the way society is going, you...

...put something out there and you didn't think it was going to go that way, and honestly, if you had read a little bit, you wouldn't have been so surprised. That happens, and that's why I'm a big believer in the liberal arts and the broad education. You can then ask the question: first of all, are your students going to be well enough prepared to take on the best jobs? I think I've already said that I think they are. There may be a slightly steeper learning curve for some of them, but these things go in and out of fashion. I can remember years ago when all the tech companies were saying, and I'll just take them at their word, that they really wanted liberal arts people, because they knew how to think generally. Somebody who's just done programming for four years, for example, will know more skills than just programming; there is an aspect of critical thought, all these things. But will they be as flexible in a world in which technologies change rapidly? Not clear, because they've potentially been narrowly trained and suddenly everything is different. So you have to kind of walk the line: you have to educate for flexibility as well as skill. A pure industry, product-focused teaching platform will probably mean less flexible employees, and a very, very broad education sure is going to require a little bit more training once you hire the people, but then at least you have a background for flexibility. So that's why I'm still a proponent of the broad education, even though people, including students, are impatient.
I think that's just true, and I think it's the role of administrators, of teachers, of everyone, I hate to say it this way, to be the parent in the room and say: look, I know you want to get out there and do all these things and only those things, but you've also got to eat some of the intellectual vegetables too, and if you think about it, they actually won't taste so bad, and you might actually grow to love them. So anyway, that's my long answer to your long question. I think I hit most of the stuff you teed up.

You hit the nail on the head. Let's talk about patience again, and this concept of a student's expectation when they're coming to campus, in terms of what AI can be, what it might inevitably be, versus where we are now on that maturity curve. I think a lot of our concepts of AI have come from science fiction. I think autonomous driving was a big "whoa" for a lot of us, and yet the road isn't full of them quite yet, to the degree that we might have thought five years ago. And then we see what OpenAI is doing with natural language processing, and now with DALL-E, and that's kind of blowing our minds all over...

...again. But I think we have this kind of weird impatience about where we will eventually be versus where we are now, and your students especially, and your faculty, need to help us bridge that gap. So when your faculty are talking with you, what are you excited about right now? What are you actually spending your time thinking about, and what are we prepping our students on today in regards to AI?

Everybody wants to learn deep learning these days. That's the engine that's driving, for example, most of the large language models that I think you referred to, and it's making its way into everything. It's kind of the all-encompassing pattern matcher, in some way. And students, well, there are two kinds of students. There are students who want to be able to say that they learned how to program, and learning how to program is important, I think, whether or not you're actually going to program; you should at least know, if you end up in such a position, what the folks working for you have to go through to produce a working implementation. But yeah, deep learning is the sexy thing these days, and then figuring out all kinds of different ways to deploy it. And usually the art is figuring out how the data that you're acquiring, or how the phenomenon that you're interested in is producing data, in such a way that you can feed it to one of these algorithms and get a sensible result. When you talk about impatience, impatience usually surfaces in terms of people having crappy data, people having biased data. No matter how great the machine, it's still a garbage-in, garbage-out kind of device. They're actually not intelligent; they're not sentient by any measure.
You might not be able to predict what the machine is going to do because, for example, the models have hundreds of billions of parameters, so they are kind of unpredictable, or at least unknowable in some fashion. But being careful about what you're giving the machine, in order to get the kind of information you want, is probably the most important thing that students need to know, and maybe something that's elided from time to time. For myself, I'm interested in whether or not a machine honestly can write a good poem. That's something I'm very interested in these days. Setting aside what 'good poem' means, it's a larger question about what are the limits, or the horizons, of possible things that feel like creativity in a machine. And so I'm focused on a very small phenomenon that my colleagues and I feel we may be able to get a purchase on. So that's a kind of funny little problem that I've been working on; understanding the poetic capabilities of these large language models is maybe a better way to say it. And it's related, certainly, to the educational space. I think one of the great educational challenges now is...

...that, given the accessibility of such high-performing algorithms to produce things that feel like answers and thinking, how do we give sensible homework assignments to students? How do we test students when they have, literally at their fingertips, digital assistants with so much computing power and such smart algorithms that a lot of standard testing mechanisms are really easily subverted? I think that's going to be a growing challenge in education. I wouldn't even be surprised if we get to the point that we wonder whether grades are worth it at all. So anyway, that's a provocative kind of statement to sort through with someone.

So you mentioned that today we're still living in a world where, if we take industry at its word, they're still relying on us to create and train graduates who can think well, broadly. They could either pluck our students right out of high school into a company where the mission is "don't be evil," or the students could go to Dartmouth and talk through what evil is. But I'm curious whether we're going to have this issue, or do you think we'll have this issue, where higher ed isn't able to move fast enough? Some of these industries, AI specifically, are so rapidly evolving, and when we look at the places that are moving the fastest and playing with the most data, perhaps you could argue it's industry, not the academy, today. How can we make sure that we're moving fast enough to continually lead the way in these spaces?

I mean, there's a question of what you want to lead. I don't know what Uber's capitalization is these days, but sure, they could have a campus, and maybe they have one, the size of Dartmouth College, with everybody there focused on self-driving cars. They could do it.
And were they interested in it, they would produce more publishable work about self-driving cars, I'm sure, than Dartmouth and probably a lot of places. These companies are capitalized so heavily that any one of them could stand up a Manhattan Project on any particular problem they want. And look, it's a question of what race you think you're playing in. Our mission, I believe, and I should say I don't know that everybody agrees with me, is to train the next generation of critical thinkers to inherit this planet that we're, you know, destroying, and to try to figure out ways to make their lives and their children's lives good ones in this place. That's our mission, I think. And along the way, that has many dimensions.

You know, studying biology, studying ecology, studying computer science, studying mathematics, studying literature, studying all these things for all kinds of purposes. I'm not really sure where the weight should be cast. If you're laser-focused on market cap and your stock price and bringing technologies to bear on the market segment your company's in, that's a different mission, and some of our students will happily sort into that, and that's great. But I actually don't even believe that we're playing in the same space as companies, even though it is a competition, I guess, in terms of attracting research dollars and being the first one with an interesting discovery. So honestly, I feel like that kind of question in particular is looking in the wrong backyard. For what it's worth, I think that often institutions aren't focused enough on why they exist, I mean higher ed institutions, and it really isn't to create the next Google. So that's, like I said, my opinion.

Dan, wonderful stuff. Finally, leave us with some next steps: advice for institutions trying to grow out their own AI programs and departments, who see this market demand. When they're thinking about recruiting and enrolling, what should that pedagogical focus be? How should they think about this challenge?

You know, you can't do everything, first of all. So you have to figure out how big you want to be. Some of it really is just a back-of-the-envelope calculation. If you have a faculty of a given size, and you can throw so much money at it, and these people teach X number of courses, really in the weeds in terms of a calculation: what can you afford to teach the students?
The guts of a good computer science program are not huge, and you just have to be smart about how you're teaching it. And there's so much open software now; having faculty who know how to use all the resources that are available, I think, is important. I still believe, as I keep saying, that some degree of breadth is important, because in a couple of years there will be another amazing suite of algorithms that your students will need to know. That is the one thing about computer science: a ten-year plan probably doesn't make any sense; a five-year plan makes sense. And then, realizing that, as much as administrators might hate to see it, you may have to have a different kind of, whatever, compensation system,...

...to hire people whose market demand is potentially so, so high. But that said, think about the environmental advantages that you can give to your faculty. Build a strong interdisciplinary cohort of scholars; that's one thing that I found is very attractive to new faculty coming to Dartmouth, anyway, the ability to work across disciplines. Do your best to break down those kinds of boundaries. Institutes of computational thinking can do that; that's one way of doing it, because lots of people use computing. So give them an environment that is exciting in a way that maybe a narrowly focused company can't. And, as kooky as it sounds, honestly do good. Don't pretend to do good like Google or Facebook; honestly do good. The coming generations actually care about the planet, they care about communities, and that coming generation is your next generation of faculty hires. So play that card, because most of the companies aren't doing it. We can still do that, and some people might say, you know what, I think that's the thing I want to contribute to, and you'll get them.

Dan, really great stuff. Grateful for your time, grateful for your wisdom. Keep up the wonderful work, and thanks so much for joining us today.

Okay, thanks, Eric. That was fun. I hope your listeners enjoyed it too.

Attracting today's new post-traditional learners means adopting new enrollment strategies. Helix Education's data-driven, enterprise-wide approach to enrollment growth is uniquely helping colleges and universities thrive in this new education landscape, and Helix has just published the second edition of their Enrollment Growth Playbook, with brand-new content on how institutions can solve today's most pressing enrollment growth challenges. Download it today for free at helixeducation.com/playbook.
You've been listening to Enrollment Growth University from Helix Education. To ensure that you never miss an episode, subscribe to the show on iTunes or your favorite podcast player. Thank you so much for listening. Until next time.
