OpenAI in the classroom: Ban or Embrace?
As a professor in the data analytics/science and quantitative finance space, I’ve always been on the lookout for new ways to enhance my students’ learning experience. Starting in 2018, I tested and integrated adaptive and personalized learning approaches, collaborative applied projects, gamification, and, when COVID hit, blended learning technology. All of them produced successful outcomes, but I expect, and am preparing for, the greatest impact to come from GPTs in 2023. It is hard not to be intrigued by the possibilities of using a generative language model in the classroom. Since OpenAI released ChatGPT last November, it has been making waves in the data science, marketing, and certainly the academic world as a powerful tool capable of generating human-like text responses. The question now is: Is ChatGPT the way of the future, or a threat to traditional classroom learning?
ChatGPT is built on the GPT-3.5 model (Generative Pre-trained Transformer), the latest in a line of language models that OpenAI has been developing and training since its founding in 2015. When the AI-powered chatbot was made available to the public via OpenAI’s website, it quickly became a viral sensation, attracting over a million users within six days of its launch. Users from all over the globe posted their questions, shared the exchanges on social media, and received strikingly human-like responses from ChatGPT. While the chatbot, powered by natural language processing (NLP), is still in its research preview phase, many users (including students) have been testing its ability to write seemingly anything, generate (almost) functional scripts in most coding languages, or hold compelling conversations with… well… artificial intelligence (see the Silicon Valley episode). All of this free of charge (for now at least).
As the impact of ChatGPT on the educational system is being evaluated, several universities have responded to the potential threat with preventive measures such as blocking access on school computers and networks, adapting their curricula, or redesigning entire courses. Most of the debate in education has since revolved around the cheating potential ChatGPT poses. I have heard of colleagues increasing the weight of oral exams or group work, with some even going back to handwritten assessments. As educators worry about the havoc that ChatGPT could wreak on our lesson plans, the field is eagerly awaiting features from OpenAI, such as cryptographic watermarks that would make ChatGPT-generated content more easily recognizable, and is turning to tools like ZeroGPT to detect and deter student cheating. Writing essays, solving coding assignments or math problems, or simply answering all their multiple-choice questions: all this power now sits at the fingertips of the next generation. If you think Chegg was a threat to standardized testing, check back in another few weeks.
Beyond the academic integrity concerns, I am particularly worried about the negative impact on students’ critical thinking and communication abilities.
The value of traditional classroom learning could be at risk if personal engagement diminishes because students complete assignments with the help of a bot rather than with their peers or educators. There are also legitimate questions about the ethics of AI-generated writing, and reasonable concerns about whether the answers ChatGPT provides are accurate. Even the almighty bot gets math or data science problems wrong sometimes, puts the user at risk of introducing fatal biases or incorrect references, and is certainly not accountable for its actions (or words). The chatbot often provides very convincing and well-written answers that may nonetheless be factually incorrect, which makes it crucial to have open and transparent communication with our students about the limitations and potential risks of using AI tools.
There are still several tools we can utilize as educators that are challenging for a bot to replicate, such as promoting collaborative learning and classroom discussions, encouraging students to demonstrate their learning through creative methods (sorry, but AI has PowerPoint covered; see Tome), or implementing projects that mirror real-world issues and require critical thinking. In my personal experience, these strategies are quite effective at motivating students, helping them discover their own voice, and making them more self-directed and invested in their learning progress.
So is ChatGPT just a tool for cheating? I don’t think so. In my opinion, there is a lot of potential in using ChatGPT as a “copilot” in the classroom, for both students and educators. When GitHub launched its AI-powered code completion tool Copilot in 2021, the community was (mostly) thrilled because it helped us write code more quickly and efficiently. You provide a rough draft of your code or a few comment lines, and the underlying ML model completes lines of code for you, provides real-time feedback, reduces errors, and suggests best practices. Similarly, ChatGPT has the potential to significantly improve student engagement and learning outcomes. I have listed a few options below; please note that this list is not exhaustive, and I welcome any feedback and insights:
- AI-powered tutors or personalized learning modules can significantly enhance the individual learning experience. Furthermore, ChatGPT can assist in the evaluation process by giving students immediate feedback and highlighting areas for improvement, enabling them to catch and correct errors in their writing (see the sketch after this list).
- Higher productivity: AI can save time for students and instructors by quickly producing lecture notes, which are often more effective than rewatching recorded Zoom lectures. Certain tasks can even be automated (e.g., writing summaries), freeing up more time for students and educators to focus on creative work.
- Educators can provide students with a variety of examples that are more relatable to them, rather than sticking to the topics the educator is most comfortable with (I’m sure my students are already tired of my car and financial market examples), while still achieving the same learning objectives.
- ChatGPT can support students in expressing themselves more effectively, whether they have difficulty putting their thoughts into written words or are studying in a language that is not their first.
- AI can also be used to enhance the learning experience by generating unique and creative writing prompts that will challenge students to think more deeply about a topic. Critical thinking!
- I also view it as a useful supplement to other resources, such as Google, as it can offer different examples and viewpoints and help manage the excessive amount of information we are all facing.
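To make the “copilot” idea concrete, here is a minimal sketch of how an instructor might script first-pass feedback on a student answer using OpenAI’s Python client. The model name, prompt wording, and rubric below are illustrative assumptions on my part, and any AI-drafted feedback should of course be reviewed by the educator before it ever reaches a student.

```python
# Minimal sketch: drafting first-pass feedback with the openai Python package
# (pre-1.0 interface assumed). Model names and pricing change quickly, so treat
# this as an illustration rather than a recipe.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes a key is set in the environment


def draft_feedback(student_answer: str, rubric: str) -> str:
    """Ask the model for constructive feedback the instructor can review and edit."""
    prompt = (
        "You are a teaching assistant. Using the rubric below, give constructive, "
        "specific feedback on the student's answer. Do not assign a grade.\n\n"
        f"Rubric:\n{rubric}\n\n"
        f"Student answer:\n{student_answer}\n\n"
        "Feedback:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",  # instruction-tuned model available at the time of writing
        prompt=prompt,
        max_tokens=300,
        temperature=0.3,  # keep feedback focused rather than creative
    )
    return response.choices[0].text.strip()


if __name__ == "__main__":
    print(draft_feedback(
        student_answer="Correlation between two features implies causation...",
        rubric="Explains correlation vs. causation; gives a concrete counterexample.",
    ))
```

A nearly identical call could also generate discussion questions, writing prompts, or topic-specific examples, which is exactly the kind of low-stakes assistance I have in mind when I call it a copilot rather than a replacement.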
With the spring semester only a few weeks away, I am faced, as a data science professor, with the question: What is my stance on using ChatGPT in the classroom? As educators, it is crucial to consider the impact of AI on the educational experience of our students and on the world they will inherit. I have decided to embrace the use of AI in the classroom, but only in specific areas and with proper precautions in place. I do see the potential benefits of using ChatGPT, such as personalized learning and enhanced engagement, but I also understand the concerns about cheating and about diminishing the value of traditional classroom learning. ChatGPT can provide quick and easy answers to questions, but it does not build the critical thinking and problem-solving skills that are essential for academic and lifelong success. I believe that the use of ChatGPT should be limited and combined with other tools and methods to prevent cheating and promote critical thinking.
Ultimately, AI technology is not going anywhere, and its effects will be widespread in academia and many other fields. Standardized tests, a fundamental component of our educational system, may soon be answered by ChatGPT (it almost passed the bar exam!), which forces us to remain vigilant while also staying open to new possibilities and adapting our approach as necessary. ChatGPT is not infallible (yet?), and it is our responsibility as educators to plan a course of action for using generative AI in a way that is ethical, fair, and accountable.