
Writing and learning at stake in the age of AI


I have been having increasing trouble explaining to my colleagues in the newsroom that copying and pasting from the internet, or from widely available tools based on large language models such as ChatGPT, is counterproductive for journalists, harming both their knowledge and their professional development. But to no avail.
What I have been experiencing as a news editor is neither unique nor new, as research points to the extensive use of such writing tools in both educational and business settings. In the long run, this could undermine crucial human faculties such as critical thinking and information retention.
It is important for the state, society and all learning and content-producing institutions to sound the alarm that continued overreliance on machines will be damaging to humans and society. We should heed the calls of experts who say that many people are unaware that relying only on machine tools to complete tasks will, over time, undermine professional progress and the development of skills. It prevents people from becoming better journalists, researchers or academic achievers. It impedes journalists’ ability to produce cutting-edge reports and to inform society with thoughts and reflections based on new insights. We cannot rely on re-presenting what is already available, merely aggregated by the machine again and again.
Massachusetts Institute of Technology research, conducted with the participation of 54 students, revealed some alarming findings. The students were split into three groups and were each assigned the task of writing an essay in 20 minutes. One group used ChatGPT, the second a search engine and the last group had only their brainpower.
The researchers measured the brain activity of the students during the exercise, while two teachers later marked the essays. The ChatGPT group scored significantly worse than the brain-only group on all levels. The brain measurements showed that the different areas of their brains connected to each other less often during the exercise. And more than 80 percent of the ChatGPT-aided group failed to quote anything from the essays they had written when asked about their details afterward.
The ChatGPT group appeared mostly to focus on copying and pasting, and the teachers could easily spot their “soulless” essays: despite good grammar and structure, they lacked creativity, insight and a personal touch.


The results of this preprint study, which is yet to be peer reviewed, resonated with educators and academics, who are grappling with the impact of digital tools on knowledge and education. The team behind the study received more than 3,000 emails from frustrated teachers.
Lead author Nataliya Kosmyna emphasized in media interviews that the study underscores the need for more research into how artificial intelligence tools affect learning and how people can use them more carefully to support it. However, many in the academic and publishing worlds are already noticing the damaging impact of AI tools being misused by students and writers alike. Such misuse makes people lazier and less inclined to solve problems or use their initiative. They are also less likely to take ownership of, and responsibility for, what they produce.
This research was not the first to identify a strong correlation between the misuse of AI tools and the erosion of skills such as critical thinking. Younger users in particular are exhibiting greater dependence on AI tools and are consequently recording lower cognitive scores. Because these systems deliver answers on demand when prompted with the right keywords, they also reduce engagement in deep analytical processes, the study found.
It is a no-brainer, therefore, that a looming shift such as the study suggests — from active information-seeking to passive consumption of AI-generated content — could be extremely dangerous for how current and future generations process, evaluate, curate and store knowledge.
If not addressed properly, this could have repercussions beyond academic and content-creation settings. Scientists say it could create broader cognitive development problems, potentially affecting our intellectual development and autonomy as a whole.
Tools based on large language models are undoubtedly in the process of revolutionizing our world, especially our educational institutions. With time, though, as the machine excels at generating texts similar to what humans have written or might write in the future, even mimicking genuine conversations, it is feared that humankind might find itself at a loss. If the connection to original sources becomes muddled with fake or fabricated versions, to the point that biases and inaccuracies are built into the information churned out on demand, then fact-checking will be not only necessary but also harder to achieve.
In an age where our lives are increasingly controlled by tools for our comfort, one must be wary of times to come, when too much comfort ceases to be bliss. Let us hope that a tyrannical age of the machine remains a dystopian fiction.

Mohamed Chebaro is a British-Lebanese journalist with more than 25 years’ experience covering war, terrorism, defense, current affairs and diplomacy.

Disclaimer: Views expressed by writers in this section are their own and do not necessarily reflect Arab News' point of view