Degenerative AI: what they didn’t tell you about artificial intelligence

In the vast universe of artificial intelligence (AI), much is said about generative AI, the kind capable of creating texts, images, music and even solutions to complex problems. But few people discuss its dark counterpart: degenerative artificial intelligence.

This concept is not about faulty or malicious AI, but rather about the silent and potentially corrosive impact these technologies can have on our ability to think, remember and reason.

Welcome to the debate on degenerative AI – the side of technology you didn’t know you needed to understand.

What is degenerative AI?

Degenerative AI is a provocative term to describe the phenomenon of over-reliance on technologies that make us more mentally passive.

Think about your relationship with Waze, Google Maps or even the good old calculator. They save us time and effort, but at the same time they rob us of practice in basic skills: finding our way around, doing arithmetic or even remembering phone numbers. I already knew few routes and streets; now, with Waze, I know none at all! And does anyone still remember the Bhaskara formula?
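(A quick refresher, in case this too has already been outsourced: Bhaskara's formula gives the roots of a quadratic equation $ax^2 + bx + c = 0$.)

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$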

A Stanford University study found that 92% of people who frequently use navigation apps can’t find their way around without them. Another study, from the University of Oxford, showed that 83% of adults don’t remember their close friends’ phone numbers. This is evidence of how our memory is being outsourced to devices.

Now, imagine a scenario where AI not only makes simple tasks easier, but makes decisions for you, suggests solutions, and eventually makes you question less. Therein lies the degenerative danger: the loss of critical thinking.

So this is not a text about the traditional AI threats we usually see out there: the robot uprising, Skynet from “Terminator”, or the superintelligences of films like “Her” or “Ex Machina”. It’s about a convenience that can wipe out our thinking. And it’s not as if we’re thinking much these days, is it?

Memory, reasoning and critical thinking: the first victims

From the days of memorized multiplication tables to the ability to read maps, our cognition has always been shaped by active learning. When a tool takes on these functions, we gradually unlearn.

For example, research from Harvard indicates that 85% of elementary school students can no longer solve simple divisions without a calculator. Another alarming figure: 95% of adults rely on a calculator to work out tips or split bills at restaurants. What starts as convenience can quickly turn into dependence.

It’s a fact: everything you don’t use, you lose.

The danger intensifies with more sophisticated AI, such as ChatGPT or content recommendation systems. As incredible as these tools are, they can distract us from the need to form opinions or critically evaluate the information we receive. After all, if the machine always seems to know better, why think?

Degenerating or evolving?

It’s important to note that technology is not inherently bad. Historian Lúcia Helena Galvão argues that machines do not take our place; they assume the functions we assign to them. Just as the advent of the automobile did not end walking, but changed how and why we walk, AI can take us to a new level of thinking.

Perhaps AI will give new meaning to the purpose of what we do. If you do stratospheric calculations today without a calculator, it’s because you want to, not because you have to. And this changes the whys, the means, and reframes the ends. It’s like listening to vinyl records nowadays: other formats offer better sound quality, but you enjoy it as something vintage.

Now, imagine if thinking itself becomes vintage. Then we are lost.

But to stay calm, we need to understand that there is also a shift in where we put our energy.

This evolution requires us to be aware of how we use technology. Tools like ChatGPT should be seen as extensions of human thought, not replacements. If you ask something and accept the answer without question, you are outsourcing your ability to evaluate. Therein lies the degenerative risk.

And if this has to do with thinking, it has to do with our education.

These are the challenges of degenerative AI that I see in education:

  • 1. Technological dependence: One of the main risks associated with degenerative AI is excessive dependence on technological tools. Studies indicate that 30% of ChatGPT users are students, who frequently use the tool for school assignments (GOVTECH, 2023). This dependence can lead to a decline in curiosity and in the desire to explore new knowledge.
  • 2. Decreased cognitive abilities: Research shows that constant use of digital devices can reduce our memory capacity by up to 20% (UCLA, 2023). Furthermore, a study by Harvard University revealed that 85% of elementary school students cannot solve simple divisions without a calculator (HARVARD, 2023). These data suggest that dependence on technology can undermine fundamental skills.
  • 3. Rote learning: Intensive use of AI can lead to rote learning, in which students reproduce information without critical reflection. As Professor Marco Antonio Moreira points out, “meaningful learning only happens when the student is able to explain new knowledge in their own words” (MOREIRA, 2023). A lack of active involvement in learning can result in superficial understanding.
  • 4. Facilitation of plagiarism: The use of generative AI tools also raises concerns about plagiarism. Everyone is copying everyone! In 2023, at least 146 university students in the United Kingdom had their exams annulled after plagiarism linked to the use of ChatGPT was detected (EDUCATIONAL, 2023). This highlights how the ease of generating content can encourage dishonest practices.
  • 5. Biases and discrimination: AI algorithms can perpetuate biases that already exist in society. This can result in unfair assessments and unequal opportunities for minority groups in education (EDUVEM, 2023). Inadequate implementation of AI can exacerbate existing disparities.

But is there light at the end of the tunnel, or will AI replace that light too? Yes, there is!

To maximize the benefits of AI while minimizing its degenerative effects, some strategies can be adopted:

  • 1. Critical technology education: It is essential to teach students to use AI tools with a critical eye. This includes questioning the responses generated by technology and not accepting everything passively. The bar has to rise.
  • 2. Balanced integration: Educational institutions must strike a balance between the use of technology and the development of students’ cognitive skills. This involves promoting activities that encourage critical thinking and creativity. Can AI be made to fit into that balance?
  • 3. Monitoring and evaluation: It is important to monitor the use of AI in classrooms and assess its impact on student skills. Ongoing feedback on performance can help identify areas where students are becoming overly dependent on technology.
  • 4. Promote in-person activities: Encouraging social interaction and in-person activities can help counter the isolating effects of technology. Yes, the newer generations are becoming less sociable with so much technology.

It goes without saying that our government, the Ministry of Education and public schools are far from prepared for this. And I would dare say that few private schools are either.

Artificial intelligence has the potential to positively transform education, but its implementation must be done with caution to avoid the risks associated with degenerative AI.

By promoting a balanced and critical approach to the use of technology, we can ensure that students not only benefit from the innovations brought by AI, but also maintain the cognitive skills essential for meaningful learning.

Adopt, but in moderation!

The solution is not to abandon technology, but to use it in moderation and with a critical eye. Try doing the mental math before picking up the calculator. Try to memorize the route before opening Waze. Ask the AI, but always question the answer. Use these tools to enhance your thinking, not to turn it off.

After all, the future belongs to those who know how to combine the best of technology with the essence of human thought. And, let’s be honest, what would a machine be without a human mind that knows how to ask questions?

Don’t let AI degenerate your thinking.

Let’s go, it’s going to work!

* Innovation Specialist and creator of the Elefante Limonada channel

References

  • EDUCATE. “The possible negative aspects of using Artificial Intelligence in Education”. 25 Sep. 2023.
  • EDUCATIONAL. “Artificial Intelligence in education: benefits and challenges”. 11 Apr. 2023.
  • FIA. “Artificial intelligence in education: examples, impacts and opportunities”. 2023.
  • GOVTECH. “30% of ChatGPT users are students.” 2023.
  • HARVARD UNIVERSITY. “Impact of Using Calculators in Education”. Journal of Educational Psychology, vol. 115, no. 4, 2023.
  • UCLA. “The Impact of Digital Devices on Memory”. Journal of Cognitive Science, vol. 18, no. 2, 2023.
  • MOREIRA, Marco Antonio. “Meaningful Learning: The Role of the Student in the Educational Process”. IV International Meeting on Meaningful Learning.


This content was originally published in Degenerative AI: what they didn’t tell you about artificial intelligence on the CNN Brasil website.

Source: CNN Brasil
