How to Increase Online Student Engagement

ABOUT THIS EPISODE

Dr. Unnati Narang, Assistant Professor of Marketing at the University of Illinois at Urbana-Champaign, joins the podcast to talk about a first-week-of-class prompt that improved student engagement by 30%, other pedagogical tests that didn't move the needle at all, and how to make sure we're all learning off each other's online pedagogy experiment curves.

Check out this resource:

The “Idea Advantage”: How Content Sharing Strategies Impact Engagement in Online Learning Platforms

You're listening to Enrollment Growth University from Helix Education, the best professional development podcast for higher education leaders looking to grow enrollment at their college or university. Whether you're looking for fresh enrollment growth techniques and strategies or tools and resources, you've come to the right place. Let's get into the show.

Welcome back to Enrollment Growth University, a proud member of the ConnectEDU podcast network. I'm Eric Olsen with Helix Education, and we're here today with Dr. Unnati Narang, Assistant Professor of Marketing at the University of Illinois at Urbana-Champaign. Unnati, welcome to the show.

Hi Eric, thank you so much for having me.

Thanks so much for being here, and I'm really excited to talk with you today about your new research findings on how to increase online student engagement. But before we dig into that, can you give the listeners a little background on the University of Illinois at Urbana-Champaign and your role there?

Sure. So I joined the University of Illinois in the middle of the pandemic, after graduating with my PhD in marketing from Texas A&M University, and one of the really appealing things about the University of Illinois, and especially the Gies College of Business, is their focus on online education and just a whole new way of looking at the future of higher ed. So that was something that attracted me, and I've continued to research more and more about this engaging topic.

I love your background and the current research work that you're doing. Let's dive in, Unnati. We all hoped automation would save us, nudging students with the right message at the right time. The promise was huge, but the research returns have been mixed. Today, what have we learned so far about the behavioral economics of student nudging in higher ed?

So if you...

...think about nudging, this theory was popularized by Richard Thaler and Cass Sunstein in their book in 2008. The idea is quite simple: can you introduce subtle changes in a system or an environment that influence the way people make decisions or judgments without really enforcing them or making them mandatory, like a policy change would? So, as such, we've learned a lot from different types of nudges, particularly in the education arena. And when I think about education, it's really important to start from the fundamental problems that we hope to resolve with some form of nudging, and to my mind the two biggest barriers are access and engagement. So how can we improve access to education, make it more affordable, more inclusive, and, in addition, how can we make sure learners who access education continue to engage with it and get quality outcomes? For both of these issues, while we're still at the tip of the iceberg, the theory of nudges has been tested in various ways. So, for example, one big barrier to education is the cost. Today we all know around 44 million Americans owe upwards of $1.7 trillion in student debt. And when I did my undergrad, which I completed in India maybe two decades ago, I remember my entire tuition fee was under $2,000. So when I transitioned to the US for my PhD, I was shocked to see that most students graduate with a loan of thirty to forty thousand dollars. And so there have been nudges that education researchers and economists have looked at that, for example, would send information to students about their loan options, or would send them text messages about their financial aid applications. And, like you said, the results have been mixed. So, for instance, in a recent study they sent, you know, text nudges to...

...eight hundred thousand students about their financial aid applications, and there were no effects on either enrollments or student outcomes in terms of who applied for financial aid. So this idea that nudges may not generalize to every context, or that one nudge that works in one setting may not work in another, is not new. And this is not a unique challenge with nudges, because we see the same thing in healthcare, you know, health nudges. So this is not a unique problem in education. But what we're learning more and more is that context matters, and student heterogeneity really matters, because every learner is different. So you can't expect the exact same nudge to work in every setting for every type of individual. And how these effects scale up is something researchers really have to think hard about.

Yeah, and you mentioned the idea of increasing student engagement, and if we want to improve it, we need to test it. Can you remind us of the difficulty of pedagogical A/B testing in a traditional higher ed environment?

Absolutely. So if you go to any kind of causal inference workshop or course, and that's my specialization, so in my PhD I took a lot of courses and now my research focuses on causal inference, they would remind you that randomization is the gold standard, because if you don't randomize there's a whole lot of self-selection. We as individuals make decisions based on what we think will work for us. So if I give a treatment to certain learners and let them choose whether they want it or not, they could self-select into it. For example, more motivated learners may already do better, not because of a certain intervention but because they were more motivated to begin with. And in face-to-face environments it's very difficult to conduct these kinds of randomized tests, which are much lower cost and easier to implement in online...

...settings, and the challenges are obviously in terms of the cost of randomizing and controlling all the million things that can go wrong. And you know, that hasn't stopped economists and education researchers, though. For example, one of the classic studies we have looked at in education is the Tennessee STAR project by economist Alan Krueger, and in that study, this is in the nineties, for almost twelve thousand in-person students, they were able to manipulate class sizes. So that's an example of a really powerful manipulation, very, you know, intensive and high cost, but they were able to do it. I think what's happening with the online and digital revolution now is that the cost of conducting these randomized tests has come down, and they're much easier in some ways to run.

Yeah, you mentioned a couple of interesting things there: the difficulty of self-selection bias, and the difficulty of insisting on and ensuring randomization. I've also seen educators split their class of twenty students into two groups, call it an A/B test, and say they learned something. And we're talking about statistical significance, significant confidence: did you really learn anything? So the volume of students that you're able to access in your research through the Coursera partnership and platform is a great way to get us to statistically significant levels of confidence in our learnings. But when I think about the traditional college student versus students taking a MOOC or a Coursera course, I'm thinking about what you just mentioned about true randomization and self-selection bias. How comparable are these two types of students, in terms of us taking these generalized learnings and assuming they will behave and work similarly throughout all of higher ed?

Yeah, I think I'm gonna answer your question in two parts. You made a...

...very interesting point about the volume of students on online platforms and in online environments. So I think to me the power of online is not just the additional volume of students; it also allows us to observe much richer actions taken by those students, in a way that we couldn't have observed in face-to-face environments. So when I worked on my paper with Coursera, I had access to very granular information about how a learner watches a video: When do they start it? Do they pause it? Do they speed it up while watching? Do they come back and rewatch it? Are they binging? And even for assessments, how many attempts are they making? Are there certain points in an assessment or a quiz where they need more assistance or help? And so combining all of this rich data with all of our current advances in statistical and computational techniques, to me, that's the power of experimentation in the online world, and when combined with rich theoretical insights and good experiment design, what we can learn can be really powerful. So there are, I guess, two things that I wanted to mention. One is, of course, the volume of the data that we can have access to, but also the richness in really understanding how learners behave. And this is happening in K-12 education as well. I have colleagues in marketing at the University of Toronto, like Dinara Actualina, who looks at K-12 education, particularly gamification in a math learning app, and there are others at the University of Chicago, such as Ruyan Shah, who does work at the Behavioral Insights and Parenting Lab and studies how parents' and children's interaction with reading can improve literacy, by giving them a tablet in a randomized fashion. So there are all these powerful questions we were not able to answer in the pre-digital era. And to your second question, where you asked me more...

...about, if we've learned something from learning environments that are online, how applicable are these results to the physical world, especially if you think about the types of learners who are taking courses on Coursera, and how they differ from a traditional learner. I think there are certainly differences. So when Coursera was founded by Daphne Koller and Andrew Ng in 2012, they were really focused on improving access for learners who were traditionally excluded from face-to-face university education. And if you look at Coursera now, a typical learner's profile would be a mid-career executive who already has an undergrad degree, and they may be international for the most part. I teach a Coursera class on applying data analytics in marketing, and most of my learners, I think, are employed full time and already have a degree. Many are transitional workers trying to make a career transition. And apart from the US, I think most of the learners are coming from India, Mexico, Brazil, and China, and even within the US we see a lot of mid-career Americans who drive the demand for online programs, including online degree programs. In fact, there was a study published in 2019 by economist Joshua Goodman and his co-authors that looked at Georgia Tech's online MS in computer science, which was one of the very first degree programs that combined online education elements, and they showed that mid-career Americans drove the demand for these online courses, but then it also spilled over into more demand for in-person enrollments in that course. So overall, online education seemed to be increasing access to traditional education as well. And so I think that when universities look at this learner type or demographic, there is a lot of applicability of what we learn from the online world to degree programs,...

...which are more and more becoming hybrid anyway, particularly post-pandemic.

I love that context, and I love that Joshua Goodman's team gets a shout-out. He's a friend of the podcast and has been a frequent guest, so he'll be tickled that his work was cited. Unnati, let's dig into the learnings here. I similarly saw that big mass-scale texting experiment that you mentioned, and was sad and disheartened to see that they couldn't find statistical confidence in its effects. But you've come across something that I think we can all benefit from. Talk about your recent research learnings regarding prompts during an online student's first week of class: what worked, and what didn't?

Perfect. Yeah, I'm very happy to share my own research in this area. So I started working on this project back in 2016 as a PhD student. For those of you who don't know, I had a higher ed edtech background even before pursuing my PhD. I worked with Columbia's executive education team to launch their in-person programs online for a variety of executives, and this was a premium course offered for $2,000 over a period of eight weeks, but fully online. So during my PhD, even though I study digital platforms and mobile apps more broadly, I wanted to continue learning about online education, what works and what doesn't, and one of the most cited stats about online education really is low completion rates and low engagement. So I was starting to think about what we can do to improve learner engagement in these massive online courses, where most people feel lost or anonymous and don't really have the community that an in-person student has access to.
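The random-assignment logic Dr. Narang describes, assigning learners to conditions by chance rather than letting them self-select, can be sketched in a few lines of Python (a minimal illustration; the condition names and group structure are assumptions for the sketch, not the study's actual code):

```python
import random

# Condition names are assumed for illustration.
CONDITIONS = ["control", "identity_sharing", "idea_sharing"]

def assign_conditions(learner_ids, seed=42):
    """Randomly assign each learner to one experimental condition.

    Shuffling before a round-robin split keeps the groups balanced in
    size, and because the assignment ignores everything about the
    learner, motivation cannot leak into which course version they see,
    which is exactly the self-selection problem randomization removes.
    """
    ids = list(learner_ids)
    random.Random(seed).shuffle(ids)
    return {lid: CONDITIONS[i % len(CONDITIONS)] for i, lid in enumerate(ids)}

# Roughly the scale of the study: about two thousand learners.
assignment = assign_conditions(range(2000))
```

With 2,000 learners this yields three groups of 666-667, so any later difference in engagement between groups can be attributed to the prompt rather than to who chose to see it.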
So I thought about the traditional strategies that work in physical classrooms, where instructors will often invite students on day one of class to introduce themselves, to share something about their life with their peers, and I wondered how effective these would be in online environments. And...

...going in, my hypothesis was that perhaps these would increase some sense of belongingness or community. So this was going to be a paper about how identity sharing, sharing something about who you are, might be very helpful. But when I actually conducted the study, we had two conditions. One was identity sharing, and one was idea sharing, where, in addition to sharing something about yourself, I wanted to see whether sharing something about the content of the course you've signed up for makes you more engaged in subsequent videos or assignments. And what surprised me is that when we randomly assigned learners into these two versions, and of course we had a control condition with no such nudge, what really seemed to drive their performance and engagement was idea sharing. And this result surprised me, because I would have assumed that being asked something about who you are would matter more in these environments, because people want to feel connected. And again, I can go through the specifics of the study if you like.

Please, please.

We had about two thousand learners in this experiment. This was a marketing course offered in September and October of 2019, so we had two different cohorts, and we were able to randomize by creating versions of the course that varied in only one aspect, which was the discussion prompt in the very first week of the course. And if you look at the discussion prompt, it's basically a thirty-second video of the instructor. So it's a very subtle nudge, where the instructor is saying, hey, welcome to this course, why don't you share something about who you are with your community of peer learners, and there we would have a prompt where learners could actually post their responses right below the video. In the idea-sharing nudge, on the other hand, we had the instructor record a similar thirty-second video...

...where the message was: hey, you've signed up to take this marketing course, why don't you share something about your thoughts on how the digital world and digital marketing are shaping firms and consumers? So really, that was the only difference between the versions of the course, and learners in each condition then took the course as usual. They saw this prompt in the first week, and it was optional for them to respond to. So again, a very fundamental nudge idea: we're not forcing them to answer; they were just invited to share something. And then we subsequently measured, during the four weeks of the course, the effects of these sharing prompts on their engagement with videos and different assessments. And really, what surprised me was that our results showed that asking learners to share ideas led to about a 30% increase in both how many videos they watched in the course and how many assessments they successfully completed.

So encouraging. Can you give us a quick reminder of what that lift actually turned out to be?

So that was a lift in terms of the number of videos. This course had forty-one videos, so we would see them complete more of the videos in the course.

It's so exciting and encouraging, and just a great reminder of what feels intuitive. You designed a great experiment, you had a prior going into it, you were wrong, but we learned something really, really important that can help not only that course succeed, but, as we continue to get onto these large online platforms with statistically significant levels of student populations and we're all running these experiments, it's so encouraging and promising that we might be able to get to better and higher quality online education faster. Maybe leave us with some next steps: advice for institutions who are listening, who are excited about your research, who want to...

...build the highest quality and most engaging online experiences they can. How can they think about that, and how can they make sure that they're learning off other institutions' curves as well?

Eric, that's a great question and a very thought-provoking one. Where are we headed in the future? What can we learn from the digital revolution about learner engagement, and about the economics of education more broadly? So at Gies, our direction has really been focused on the online side of offering higher ed with our iMBA degree, which is our flagship online MBA program. But for the future, we're really looking toward more adaptive learning, the use of artificial intelligence, and personalized learning. How can we learn from each individual learner's patterns, personality, and behavior, what works for them and what doesn't, to cultivate a unique learning experience for them? I think that's the direction in which the field is headed, including a growing focus on micro-credentialing and working with industry. So we've had a partnership with Google, for example, to be able to offer the skill sets that learners need in the way they are able to get them. Most of our audiences for these programs today are mid-career executives. So how can they flexibly design their own learning and get the kind of experiences that will really help them not only engage with the course but also get to the outcomes that they want in their lives more broadly? So I think that's the direction in which we're headed, and other schools have done this too. You mentioned learning from others, so I know the Pounce program has been very successful at Georgia State University, and some others, like ASU, are also experimenting with similar AI- and data-driven programs, and what this shows me is the power of learning from data, and the power of large-scale experimentation and AI in creating those unique learner experiences and learner journeys.
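The kind of engagement comparison discussed in this episode, checking whether a treated group watched meaningfully more videos than the control group, can be sketched with a simple permutation test. This is an illustration only: the numbers below are synthetic, constructed to mimic a 30% lift on a 41-video course, and this is not the analysis or data from the study itself.

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(treated, control, n_perm=2000, seed=7):
    """One-sided permutation test for a difference in means.

    Repeatedly reshuffle the pooled observations into two groups of the
    original sizes; the p-value is the share of reshuffles whose mean
    difference is at least as large as the one actually observed.
    """
    observed = mean(treated) - mean(control)
    pooled = list(treated) + list(control)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = mean(pooled[:len(treated)]) - mean(pooled[len(treated):])
        if diff >= observed:
            hits += 1
    return observed, hits / n_perm

# Synthetic engagement counts (videos watched out of 41), illustrative only.
rng = random.Random(1)
control = [rng.randint(5, 30) for _ in range(300)]
idea_sharing = [min(41, round(x * 1.3)) for x in control]  # 30% lift by construction

observed, p = permutation_test(idea_sharing, control)
```

With a lift this large and a few hundred learners per group, the p-value comes out far below 0.05; the same procedure on a twenty-student class split in half would rarely clear that bar, which is the small-sample A/B-testing problem raised earlier in the conversation.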

Thank you so much for your time today. What's the best place for listeners to reach out if they have any follow-up questions?

Of course. They can contact me directly by email at unnati@illinois.edu. I'm also fairly active on Twitter, so if there are listeners who are on Twitter, I would love to connect with them there.

Awesome, Unnati, thanks so much for joining us today.

Thank you so much, Eric.

Attracting today's new post-traditional learners means adopting new enrollment strategies. Helix Education's data-driven, enterprise-wide approach to enrollment growth is uniquely helping colleges and universities thrive in this new education landscape, and Helix has just published the second edition of their Enrollment Growth Playbook, with brand new content on how institutions can solve today's most pressing enrollment growth challenges. Download it today for free at helixeducation.com/playbook. You've been listening to Enrollment Growth University from Helix Education. To ensure that you never miss an episode, subscribe to the show on iTunes or your favorite podcast player. Thank you so much for listening. Until next time.
