By Steven Zhang.
ChatGPT offers students two complementary forms of assistance in an academic setting: it can hasten the mundane, nonproductive aspects of the educational process, and it can augment the aspects of the curriculum that require the student to engage in productive thought. In the former role, ChatGPT can be seen as an extension of existing technological tools such as search engines, which, upon their introduction, gave students a convenient alternative to libraries and a resource to turn to when teachers were not available. Professor Marc Watkins at the University of Mississippi writes that AI programs such as ChatGPT can provide “a quick summary of articles using a research question as a prompt, making it easier for students to find relevant information from a database of open-access journal articles. It’s like JSTOR on 1980s box-office Schwarzenegger steroids” (Watkins). A survey conducted by The Learning Network on student opinions about the use of ChatGPT likewise found that, “when used on homework, something usually meant for learning and practice, it can allow a student to more clearly grasp the subject” (The New York Times). The same survey goes on to ask, “If a student needs to look up an answer anyway, is it not far better to have a more convenient option that also very clearly explains the concept?” (The New York Times). In such cases the end result is identical whether or not ChatGPT is involved in the process. By extension, it is reasonable to argue that using ChatGPT is no different from using search engines, or textbooks before them: it merely provides a more convenient way of completing tasks that reach the same end result.
Yet beyond merely expediting mundane tasks, ChatGPT has shown the potential to assist students in subjects that require critical thinking and reasoning. Watkins, after experimenting in a literature class with a similar program called Elicit, wrote that “One student reflected on how Elicit helped them expand their knowledge… Another student shared that Elicit provided them with a ‘broader perspective about what details [they] should write about’”. He later adds that “language models are just math and massive processing power, without any real cognition or meaning behind their text generation” (Watkins). Students using ChatGPT must therefore still produce their own ideas, since those cannot be spontaneously generated by an AI. Yet by using AI as a tool, students can explore corollaries and counterpoints that force them to respond with additional ideas and counterarguments, resulting in more critical thinking and reasoning, not less.
Scientific disciplines are a critical area where ChatGPT has shown the potential to greatly augment the thinking done by students. While scientific fields focus on objective data and analyses rather than arguments and opinions, the scientific community has already identified several ways of incorporating increasingly sophisticated tools such as ChatGPT into research, ways that can be adapted for instructional purposes as well. Daniel Novak, a professor at the University of California, Riverside, argues that “ChatGPT can help students quickly and easily access a vast amount of information” and “assist students in analyzing large amounts of data and identifying patterns and trends,” through which it can “save [students] time and effort in their research” (Pittalwala). The methods Novak proposes are not new; they are an extension of tools frequently used in professional academia. The difference is that in research at the postgraduate level there is rarely any stigma against using the full range of available tools, while in educational settings, often arbitrary and ambiguous limits are placed on what constitutes an “acceptable” form of technological assistance.
However, it is important to draw a distinction: research at the postgraduate level cannot be equated with education at the secondary or undergraduate level, and therefore placing certain limitations on the use of technological assistance in the latter is not entirely unjustified. To equate permitting the limited use of ChatGPT with permitting its unrestricted use in all situations is to attack an exaggerated argument that was neither proposed nor implied. For one, as Professor David Kipping at Columbia University suggests, it is reasonable to assume that AI programs such as ChatGPT will eventually be able to dramatically outperform students in examinations at the secondary or undergraduate level (Kipping). Because these are often introductory-level courses, students are expected to develop a thorough understanding of the topics involved so as to build a solid foundation for future courses. In such cases it is perhaps not unreasonable to argue for a total ban not only on ChatGPT but on all forms of assistance, including reference materials and notes, during examinations. Therefore, in drawing such distinctions about what constitutes a justifiable use of AI assistance, it is necessary to define guiding principles that determine how each case is decided on an individual basis.
This distinction, however, is a natural extension of the claim that the role of ChatGPT is to free the student to engage with the more productive aspects of the curriculum. In deciding when to allow the use of ChatGPT, it is the responsibility of educators to determine whether its use will expand or diminish the aspects of the schoolwork that hold educational value. Novak summarizes this point by writing, “ChatGPT can help students understand and retain complex information by summarizing it in a more accessible format” (Pittalwala). In this case, the use of ChatGPT holds clear productive value, as it offers students a more robust way of retaining information. Similarly, Ward Beyermann, a professor and colleague of Novak’s, argues that “If students focus on advanced aspects of their schoolwork, chatbots could assist with more menial educational tasks” (Pittalwala). While broader in scope, Beyermann’s argument reinforces the notion that ChatGPT can increase the amount of productive work done by decreasing the time spent on the aspects of schoolwork that provide no educational benefit. However, it is equally important to acknowledge the situations where the opposite is true, where permitting ChatGPT reduces the aspects of schoolwork that provide educational value. Daniel Herman, a high-school English teacher, expresses one such concern: “But if most contemporary writing pedagogy is necessarily focused on helping students master the basics, what happens when a computer can do it for us?” (Herman). Indeed, it is unrealistic to assume that there exists no situation in which AI tools can substitute for the role of the student. This reflects a broader concern expressed by educators: while the intention might be for ChatGPT to be used in honest and productive ways, the potential for misuse cannot be disregarded. In this sense, the role of the educator is critical in determining how students are tested so that the practical extent of their knowledge is exercised, whether that means permitting the use of AI programs such as ChatGPT, restructuring aspects of the curriculum to accommodate their limited use, or prohibiting their use altogether in situations where students are expected to demonstrate proficiency without outside assistance.
Works Consulted
D’Agostino, Susan. “Designing Assignments in the ChatGPT Era.” Inside Higher Ed, 31 Jan. 2023, https://www.insidehighered.com/news/2023/01/31/chatgpt-sparks-debate-how-design-student-assignments-now.
Herman, Daniel. “The End of High-School English.” The Atlantic, 16 Dec. 2022, https://www.theatlantic.com/technology/archive/2022/12/openai-chatgpt-writing-high-school-english-essay/672412/.
Kipping, David. “ChatGPT Takes a College Level Astrophysics Exam.” YouTube, 7 Jan. 2023, https://www.youtube.com/watch?v=K0cmmKPklp4.
Pittalwala, Iqbal. “Is ChatGPT a Threat to Education?” UCR Magazine, University of California, Riverside, 25 Jan. 2023, https://news.ucr.edu/articles/2023/01/24/chatgpt-threat-education.
Roose, Kevin. “Don’t Ban ChatGPT in Schools. Teach with It.” The New York Times, 12 Jan. 2023, https://www.nytimes.com/2023/01/12/technology/chatgpt-schools-teachers.html.
Watkins, Marc. “Guest Post: AI Will Augment, Not Replace.” Inside Higher Ed, 14 Dec. 2022, https://www.insidehighered.com/blogs/just-visiting/guest-post-ai-will-augment-not-replace.
“What Students Are Saying about ChatGPT.” The Learning Network, The New York Times, 2 Feb. 2023, https://www.nytimes.com/2023/02/02/learning/students-chatgpt.html.

