COVID-19 took the world of higher education by storm. Colleges and universities had to rapidly respond to the demands of students and instructors given the new norm. And now, just when it looks like the world is catching its balance after a global health-care crisis, it seems that the new scare is artificial intelligence.
With the emergence of technologies like ChatGPT, there’s been increased conversation about how educational systems will look moving forward and the potential impact AI will have on teaching and learning. There are a number of lingering questions to discuss here: What AI tools are currently being used? Will cheating increase? How can institutions respond to AI developments? And how can we ensure that students are still engaged in learning?
AI Tools Currently Used in Academia
Contrary to some of the social-media outrage regarding new developments in AI technology, students and instructors at institutions of higher education are already using several AI-powered tools. One of the most common AI writing tools is Grammarly, which offers spelling, grammar and structural support to help users improve their writing. The tool continues to evolve to find new ways to support its users.
Some higher-education institutions have adopted Grammarly as an institution-wide tool for faculty and students. Because Grammarly is an AI writing tool, however, programs like TurnItIn can flag assignments edited through it as AI-generated content, even when the piece is an original composition. TurnItIn is plagiarism-detection software with new features geared toward detecting AI-generated text in student work. The company boasts a 98% success rate, along with support for instructors and Learning Management System, or LMS, integrations.
There are special populations that should be considered when thinking of AI tool usage. Students with disabilities and English Language Learners often rely on AI to ensure they receive the best education possible. Image and facial recognition tools assist students with visual impairments while text tools like QuillBot help students who need support with summarizing or understanding written content. No singular tool or software meets the needs of all learners, so it’s important to consider special populations when taking a stance on AI usage in academic settings and what constitutes academic integrity.
Will Cheating Increase?
To be quite honest, cheating may increase due to AI. However, reports indicate that college students are largely indifferent about their decision to use the AI tools available to them. Academic integrity is a valid concern for colleges and universities, so time is of the essence to revamp the definition of “cheating” or academic dishonesty in light of the new tools being made available. Institutional leadership can take the rapid emergence of AI as a call to action to meet with faculty and staff and develop policies and procedures around the ethical and acceptable use of emerging tools.
Some of the tools, like Grammarly, can be seen as supplemental academic supports that help students improve original written work. Other tools, like ChatGPT, present a unique situation in terms of ethical AI use. Since ChatGPT can generate entire written pieces and pull information from a catalog of sources, instructors may have to redirect students’ research skills, reteach the concept of primary and secondary sources, and encourage students to vet the references they find for validity, bias and reliability.
How Institutions Can Respond to AI Developments
Institutions need to consider two populations when moving forward with AI developments. First, institutions should consider how the developments will affect students’ learning. Then, they will need to work with faculty and staff to determine the impact AI will have on teaching.
Yale University developed an AI resource page to help faculty and staff understand and adapt to AI integrations in higher education. If institutions are uncertain about their next move regarding AI, this may be a good starting point. Eventually, decisions will have to be made, and they’re better made sooner than later.
AI is already affecting how students learn and access educational materials. So the first step is to determine what constitutes ethical and acceptable use, then communicate that message to students consistently, embedding it in resource pages, welcome materials and course syllabi. Special consideration should go to students with learning needs and those in asynchronous settings, who may interact with course content more than with institution personnel.
Whether instructors are for or against AI in teaching and learning, they need to be able to identify AI tools and understand how those tools can affect their work. This is where professional development and learning opportunities come into play. If institutions want instructors to make sound decisions about potential threats to academic integrity, they must provide faculty with reliable resources.
Ensuring Students Are Still Engaged in Learning
Some instructors may fear that students’ engagement with course content will dwindle now that certain AI tools require less input from users. George Washington University’s Office of the Provost published a guideline document with recommendations for teaching and learning practices that involve student engagement with course content.
Instructors can take two approaches:
Integrate AI tools in teaching and learning. To be able to identify the possible use of AI tools or to promote the ethical use of those that are available, instructors must develop a clear AI integration plan with policies, procedures and non-negotiables included.
Redesign teaching and learning materials to protect courses from heavy AI infiltration. This may require courses to include more real-world prompts and application topics that empower students to produce work reflecting the environment around them. Both approaches consume instructor time, but given AI’s rapid growth, instructors will either spend that time preparing for the mess or cleaning it up.
We’ve been here before. We watched the internet develop from dial-up to high-speed. Cell phones were once used for brief calls, T9 texting and the “snake” game, and they’ve transformed into 4-inch, Wi-Fi-powered computers.
We’ve already witnessed technology advance and transform how we teach and learn. AI is here to stay. It’s time to shape it.
This MFP Voices essay does not necessarily represent the views of the Mississippi Free Press, its staff or board members. To submit an opinion for the MFP Voices section, send up to 1,200 words and sources fact-checking the included information to [email protected]. We welcome a wide variety of viewpoints.