AI chatbots present a threat to the culture of education

ChatGPT functions like a virtual assistant.

Published Feb 19, 2023

“I’m not able to complete your request”, read the reply. I had asked the chatbot, ChatGPT, for a list of resources from which it constructed its answers.

This came after I had posted a series of questions of increasing complexity for it to answer. ChatGPT is the artificial intelligence programme widely debated for the threat it poses to the integrity of education and learning.

Its responses to my questions were of high quality, meaning they included all the main points one would expect from someone with a fair level of knowledge of the subject.

While the answers did not reflect the most complex nuances or current research findings on the issue, they would surely have passed a final-year exam for an undergraduate degree with a C, if not a B grade. It is on this point that all criticism against chatbots turns, namely that they will destroy academic integrity by making it possible for students not to do their own work.

Students could simply submit what was prepared by a bot. The problem is that a student who does so creates two serious crises.

On the one hand, the student does not learn and gain the knowledge a candidate needs to step into a professional role and the world of work – a failure of education to make a sustainable contribution to building a healthy economy and polity.

On the other hand, the student compromises their own moral integrity, as much as they undercut the all-important foundation of academic integrity, namely to do your own work and give credit to the work of others – a failure of education to prepare ethically conscious and morally sound citizens.

When these two possibilities become a reality, a third crisis will emerge, namely the revelation that education systems as a whole have become outdated and must go in their entirety. This is the triple threat to learning that chatbots seem to represent: learners not learning, citizens who cheat, and schools and campuses becoming a farce where existing knowledge is merely regurgitated. These threats are, however, not new.

Even if it represents a far more refined version of such tools, having a chatbot gather information for you is no different from a student gathering ideas and content from tutors, libraries, learning platforms, and the multitude of knowledge resources available on the internet.

Educators encourage students to explore a subject or question as broadly as possible, including talking to specialists, reading widely and drawing on as many resources as possible, and then to put it on paper. Since it is impossible for a student to give due recognition to every resource through, for instance, references and citations, assessments require only sufficient levels of original work.

In such an environment, chatbots offer merely another option and resource for a student, even if they function as a specialist assistant that quickly summarises the most important facts on any question. This is no different from the existing reality that information on any subject is already available in summarised format at the press of a button.

So, if the challenge that AI and chatbots pose to education is not new, what then is the ghost in the machine that frightens us? A useful way to explore this question is to consider that the deep purpose of education is for people to become more able, more conscious, more connected.

Put more precisely, to become more human. Is it the possibility of surrendering the initiative and agency to author for ourselves the storyline of what it means to be human that frightens us? Or even, at its worst, is it the face in the mirror that chatbots hold up and have become, which reveals how much we are willing to compromise in our struggle for humanity that scares us so? Both, I think.

* Dr. Rudi Buys.

** The views expressed here are not necessarily those of Independent Media.

Cape Argus
