Launched in late 2022, ChatGPT has been revolutionizing the way we write papers, novels, and news articles. It’s also changing the way we make logos, compose music, tutor, and have conversations.
Essentially, Chat Generative Pre-trained Transformer (ChatGPT) is an artificial intelligence (AI) system that recognizes and reproduces patterns in language, mimicking the syntax a human would use.
While companies developing chatbot AI safeguard the details of their training methods, Wikipedia is widely believed to be a key source of training data for many models.
ChatGPT can write documents that look meaningful and sophisticated, but critical thinking is still out of reach, and its absence often gives away AI-composed papers.
There is growing concern, however, over AI leading to plagiarism, cheating, and the dissemination of false information. That raises the question: should educators be banning generative text apps like ChatGPT from their classrooms?
Academic integrity expert Sarah Elaine Eaton, associate professor in the Werklund School of Education at the University of Calgary, emphatically says, “No.”
Eaton’s research focuses on understanding how AI can be ethically employed in teaching and learning, and on helping Canadian post-secondary institutions adapt to this rapidly changing technology.
Eaton, a featured speaker at the Congress of the Humanities and Social Sciences (Congress 2023), believes, “Contrary to what the fearmongers say, robots are not coming for our jobs, nor will they make us less intelligent.”
She compares current fears over ChatGPT to those raised when radio, television and the Internet were launched.
“Once again, we’re on the cusp of massive change – generative AI is the most creative, disruptive technology in a generation, and it’s up to us as educators to ensure our students are equipped to use it in an ethical way,” Eaton explained.
“With GPT-5 on the horizon, this landscape is changing fast and AI is going to continue to be a hot topic, so now is the time for universities to start planning for the upcoming academic year,” she added.
As principal investigator, Eaton is working with a team of University of Calgary researchers to survey about 500 professors, teaching assistants and graduate students across the globe. The study’s focus is AI and academic integrity.
Participants are given writing samples to establish their ability to distinguish between essays written by humans and those generated by AI. Then, they are asked to grade the papers.
The team will examine the ethical application of generative AI technology for teaching, learning and assessment. Based on preliminary findings, Eaton says that banning generative AI is futile.
“Rather than excluding AI altogether, we need to be having open conversations about it and show a willingness to try these new tools out,” she shared. “We’re heading into a post-plagiarism world where a human-AI hybrid writing model will be the norm, and we need to be prepared for it.”
Eaton’s research includes advocating for student assessment that challenges current norms – like the idea that all students need to write. Instead, she would like to see productive change that better aligns with learning goals and outcomes.
Eaton would like the focus to shift from debates over whether using an AI app constitutes plagiarism to teaching students how to fact-check and edit automatically generated texts while taking full ethical responsibility for their completed work.
“The AI apps themselves don’t have a sense of ethics,” said Eaton. “It’s the human writer/editor who remains responsible for the text. We can relinquish our control over writing, but we can’t relinquish our responsibility for the output.”
Eaton wants to use AI tools to teach critical thinking by getting students to separate fact from fiction. She sees this as an important step in creating future-ready graduates.
“If we want to ensure our graduates are ready for the workforce, it’s our responsibility to teach them the ethical use of these apps, because industry is already using them,” maintains Eaton.
Bonnie Stewart, associate professor in the Faculty of Education at the University of Windsor, will also be sharing preliminary findings from her research into data privacy in the digital classroom as a featured speaker at Congress 2023.
Stewart says it’s critical for educators to take back control of their classrooms and that means understanding who has control over data collected from AI users.
“The practices of our ‘click yes to terms and conditions’ culture are filtering into our practices with educational tools, where we tend to assume if our institution has approved them, they are fine. But with every click, every keystroke and every deleted search, we’re giving away data that can be used to track and pattern our behaviours,” shared Stewart.
“Our students are swimming in that world, we as educators are swimming in that world, and yet we don’t fully understand the data extraction and privacy implications of that world,” Stewart said.
Stewart surveyed 340 educators from 25 countries and found that despite an eagerness to know more about the data processes behind digital learning tools used in classrooms, teachers have been left in the dark. That puts teachers, and their students, at risk.
Based on her findings, Stewart is calling for changes to safeguard digital learning environments and wants educators included in the discussions about data ethics.
“Part of the problem is that institutional digital infrastructures have been ‘cobbled together’ from various corporate platforms, resulting in piecemeal systems for x, y, z,” explained Stewart. “Most educators operating within these infrastructures have no clear understanding of who has control over the data extracted, nor what administration or software vendors are doing with that data.”
Stewart believes new digital and ‘datafied’ systems aren’t transparent. Instructors bringing students onto a platform that an institution has approved doesn’t make clear who at the institution can see students’ logins or level of interaction.
It also doesn’t make clear who at the vendor level has access to video or audio that’s recorded or if student patterns of behaviour can be de-identified and used against them academically or to pitch products or misinformation.
Her research showed that, by design, information about data privacy remains siloed within information technology (IT) departments. Stewart’s study explored the data literacies and practices of online educators from six countries and found that all participants were aware of the potential for very real data privacy issues for students and wanted to be involved in the conversation.
“Educators see an urgent need for change and to start having real conversations about data ethics, because they want to manage their classrooms with a ‘first do no harm’ approach,” explained Stewart.
Without big picture oversight, control over data is falling into the hands of the corporate world behind education technology, including rapidly emerging AI tools like ChatGPT.
Governance policies lack input from educators; where they do exist, they aren’t communicated well, and there is no simple way to keep them up to date.
“We need to be demanding of our institutions, and quite frankly our institutions need to be demanding of our sector, that we gain more clarity, more pedagogically oriented, clear plain language in our terms of service,” said Stewart. “Maybe it’s time to demand of our vendors that we as a sector aren’t going to sign their contracts unless we see this moving in a less surveilling direction, and we understand what’s happening with our students’ data and we’re cool with that.”
Billed as a leading conference on the critical conversations of our time, Congress 2023 serves as a platform for the unveiling of thousands of research papers and presentations from social sciences and humanities experts worldwide.
Over 8,000 scholars, graduate students and practitioners will be participating in events focused on reckoning with the past and reimagining the future. The goal is to inspire ideas, create dialogue and move participants to action that leads to a more diverse, sustainable, democratic, and just society.
Scholars, artists, and activists will come together May 27 to June 2 at York University in Toronto to share in a mix of cultural and academic programs.
The experiences, knowledges and cultures of Indigenous and Black communities will be central to the conversation. Members of the public can register for a community pass to access events open to the public for $55.