How To Ensure Student Success In Higher Education With AI-Powered Feedback Analytics
![woman holding book](./assets/DhPZiq2pJr/sh-unsplash_3ngnoylnkdk-4096x2733.jpg)
Although the educational world has been focused on artificial intelligence for the past two years, many institutions have yet to recognize its benefits in teaching and learning. Many students are also confused about when, or if, they are allowed to use AI at all.
A recent Inside Higher Ed survey asked students about their understanding of the use of artificial intelligence in their learning. They found that:
- 30% were unsure or didn’t know how to use generative AI to do their coursework
- 16% said they knew how to use AI because of institutional policies on appropriate use cases
Most colleges and universities do not prohibit the use of AI products by students, faculty, or staff, but they frequently provide guidelines on appropriate use.
For example, in the University of Illinois system, students are instructed that “If you use a generative AI tool in your work, you should cite it. The APA and MLA both provide guidelines for citing generative AI work. Be sure to note how you used it: ‘I used it this much, in this way, and this percentage of work was generated by AI.’”
Cornell University has a similar policy, and adds that students are accountable for ensuring the accuracy of any content that might be AI-generated: “You are accountable for your work, regardless of the tools you use to produce it. When using generative AI tools, always verify the information for errors and biases and exercise caution to avoid copyright infringement.”
AI-powered solutions can enhance the student experience by engaging students both individually and collectively in real time. Educators gain a unified picture of students’ needs, expectations, skills, knowledge, and competencies, and can respond where support is needed. The keys are to focus on listening, act on feedback, and create a culture of trust, all of which will drive continuous improvement in teaching and improve the student experience.
That was the focus of a panel entitled “AI & the Student Experience,” hosted by Inside Higher Ed and Explorance on October 10, 2024. Moderating the panel was Sara Custer, editor-in-chief at Inside Higher Ed.
Panelists were Julie Schell, assistant vice provost of academic technology at the University of Texas at Austin; Rob Nelson, executive director of academic technology & planning at the University of Pennsylvania in Philadelphia; and Dan Liddick, business system analyst at Harvard Medical School.
Highlights of the panel discussion follow:
Feedback is a must-have to drive decision-making around student experiences. Formal and informal feedback mechanisms should be part of an institution’s long-term strategy if it wants to enhance academic performance and student engagement. How are you and your colleagues currently using AI on your campuses, and what is the response from students?
Julie Schell: Students are not as interested in learning how to use AI as they are in how to use it ethically and responsibly. They can teach themselves how to use generative AI. It’s pretty self-explanatory. So, we developed a framework that helps our faculty use AI features and communicate acceptable-use guidelines to students; it includes six key limitations of using generative AI.
Rob Nelson: The biggest challenge for the past year has been around the use of AI to produce student work, and the question of whether AI detection is something we should be doing. I see an emerging consensus that AI detection is not a viable thing to do.
Dan Liddick: The work I do is mainly focused on administrative efficiency in student information systems. Program administrators are really looking for [data that can help] identify students who are struggling so they can help those students.
Photo by Desola Lanre-Ologun on Unsplash
AI has lots of potential to enhance student engagement, retention, and overall growth strategies in higher education. How are you currently using AI to boost student engagement?
Liddick: We have developed an AI ‘sandbox’ that offers many large language model applications and image generation tools. The sandbox is available to faculty, staff, and students—anyone who wants to use it. It’s basically a chat prompt, but there is an API available, which enables a user to take data from another system and run it through the chatbot or sandbox.
For more great educational content, access Explorance’s Virtual Summit Recording.
Find out more about Explorance’s game-changing AI solution, MLY.
Even though an organization may promote a certain technology policy this month, its position on the matter could have totally changed in six months. How do you get people used to technical uncertainty?
Nelson: I spend a lot of my time fostering collaborative conversations within and across disciplines. One of the missing conversations is between students and teachers. Students are typically very reluctant to speak openly and honestly about their use of AI tools. We need to figure out where we all are from our different perspectives and share our thoughts and experiences. At the institutional level, we need to move incrementally and carefully to use these tools in such a way that they create student comfort and educational value.
Acting on feedback, as opposed to collecting it and avoiding change, is a surefire way to increase student success and boost graduation rates long-term. How can AI be effectively used to collect and share feedback?
Liddick: Having conversations across campus, and with students in particular, is really important. Listening to students through Explorance MLY, the AI-powered solution that collects and analyzes feedback, and hearing what their pain points are helps us pass useful feedback to faculty. For example, we’re not saying that students shouldn’t use gen AI in their classes, but we want to explain why it might not be a good idea to use gen AI to achieve a particular learning outcome. If we can articulate why it wouldn’t be a good idea to use the technology to complete a particular assignment or task, then we can foster that climate.
Photo by 愚木混株 cdd20 on Unsplash
We often hear students complain that when they have five different classes, they also have five different faculty that each approach AI differently. How can we solve that problem?
Schell: We have an Academic Technology Council with both faculty and student representation. We bring student concerns to this governing body to develop guidelines and best practices for our faculty. Students asked for very specific policies around the use of AI.
Nelson: A lot of our governance structures include student representatives. Most of our deans, especially the deans of our undergraduate programs, have come out with guidance that teachers must be clear with students early in a course and set expectations around the use of AI. I’m also a big believer in informal mechanisms, such as our centers for teaching excellence. Some of those centers are fostering these kinds of conversations, and they include students and graduate students who are learning to teach.
Liddick: One thing we look for is whether a tool has a chatbot that students can talk to about any issues they may have, such as with the registry or trying to contact faculty. We try to reduce the burden on our administrators, which can be very hard.
How have your policies around the use of technology in general, and AI in particular, changed in the past couple of years?
Schell: We shut off AI detection due to broad consensus that the tools aren’t yet ready for prime time. We are also trying to ensure our policies are really transparent about allowable and acceptable use cases.
Photo by John Schnobrich on Unsplash
The role of AI in education is expected to grow in both importance and adoption. How do you deal with students and faculty who are resistant to using AI and haven’t embraced the opportunities this technology can offer?
Schell: We make space for whatever level of readiness people are at. If you’re an AI skeptic, we try to assess what the root of that is. We also stress that we are not just students and faculty, but that we are a community of learners all moving together. We’ve never had engagement that broad before.
Nelson: I think of myself as an AI skeptic. I come to it from a very critical set of questions about which tools provide educational value and which ones don’t. I think that developing those critical standards, and enlisting our faculty in making those assessments, is a way to ward off skepticism.
What do you think we gain, and what do we lose, by integrating AI into our educational experiences?
Schell: What students gain from AI is ethical resilience – the ability to take information and use it to make decisions using strong ethical blueprints. I think that is a highly transferable skill and will be needed for the future of work. Also, I’m a bit worried that if we ban the use of these tools, we increase the digital divide.
Nelson: I think it’s really important that students use these tools, and not just AI but any technology, to foster and enhance human connections. We must make sure that connecting with our students and our colleagues is front and center in how we use them.
Liddick: What we lose when students use gen AI in coursework is the process by which wrestling with the material in assignments leads to a deeper understanding of it; the relationship with that knowledge is changed.
It is clear that the use of artificial intelligence tools in education will only increase, and is desired by educators and students alike. The good news is that AI has a critical role to play in increasing student engagement, not just in helping students generate reports and complete homework.
A recent study by Inside Higher Ed found that a primary factor in students’ lack of engagement with college life is a lack of information on the programs, groups, and resources available to them. AI-driven tools can help educators better identify the goals and interests of individual students and match them with the services that will motivate and reward them.
This custom content is sponsored by Explorance and developed by Inside Higher Ed's sponsored content team. The editorial staff of Inside Higher Ed had no role in its creation.