Technology | 24.03.2025

Artificial Intelligence is here to stay and universities should embrace it

Artificial Intelligence (AI) has been a significant disruptor, automating tasks, enhancing decision-making, and fostering innovation. In just a few years, it has introduced new ways of working and reshaped industries.

As such, there is considerable debate in academia about the use of AI, centring on its potential benefits and risks, its ethical implications, academic integrity, and its role in research and education.

Dean of the Faculty of Engineering, Built Environment and Information Technology (EBIT) at the University of Pretoria (UP), Prof Wynand Steyn, told MyBroadband that AI has significantly disrupted the field.

He explained that the standard engineering programme spans four years, meaning that students set to graduate in 2025 began their studies in 2022.

At that time, generative AI tools like ChatGPT were not yet available to consumers. Even those in the field of computer science, who knew AI advancements were on the horizon, did not expect how quickly they would develop.

In November 2022, ChatGPT was introduced and quickly gained widespread adoption. As these students approach graduation, AI has become an integral part of everyday life.

As AI takes the world by storm, many institutions have been looking for ways to prevent students, lecturers, and researchers from using it, choosing prohibition over adaptation.

However, UP’s EBIT embraces the tool. “I’m quite honest and open about it. We do incorporate it. It’s not a sin to use generative AI because it’s a normal tool out there,” said Steyn.

“If we tell our students, no, you cannot use it at all blanket full stop, we would be doing a major disservice to them because the industry would expect that knowledge.”

UP EBIT Dean, Professor Wynand Steyn. Photo: Seth Thorne

AI incorporated into syllabus

Steyn said that artificial intelligence is not just used as a tool to aid work but is also being incorporated into assignments.

Some assignments, for example, require students to use ChatGPT or other generative AI tools to write a two-page paper about a particular topic.

The part they are ultimately assessed on is critically linking that AI-generated output to the curriculum covered over the previous four weeks.

“Critique that document and tell us why aspects that are in there are either good or bad or correct or how can it be improved,” explained Steyn.

However, the Dean stresses that AI is not a complete solution and requires human understanding.

“It’s quite important in the end to keep on realising that at least at the level that it’s at currently, it does not solve all problems.”

“It does not respond to all your questions correctly, and it responds in a specific way based on how it’s taught. If you need to develop new solutions, it starts to become slightly problematic,” said Steyn.

As such, there are guidelines for both staff and students, which are under constant development, outlining how AI can be used ethically and where its use should be restricted.

Steyn provides an example to illustrate this. While it may be entirely acceptable for AI to schedule a hair appointment without human involvement, certain situations require human interaction.

For instance, a parent calling to book an appointment with an oncologist for a child with cancer would expect to speak to a human who can listen and respond with empathy.

In such cases, human intervention is crucial, highlighting areas where AI should not replace human judgment.

EBIT developing AI

Professor Steyn highlighted the faculty’s efforts to develop AI for less widely supported languages, such as African languages.

Within the computer science department, there is a dedicated initiative focused on addressing the challenges that under-resourced languages face in AI.

While AI is highly proficient in English, thanks to the vast amount of English-language data available, many African languages are underrepresented.

As a result, AI struggles with these languages, limiting its effectiveness for their speakers. Steyn emphasises that if a language is not part of an AI’s database, it is effectively excluded from AI-driven solutions.

To bridge this gap, there is a specialised centre that collaborates with other institutions across Africa and internationally.

The Centre for Artificial Intelligence Research (CAIR) is a South African national research network that conducts foundational, directed and applied research into various aspects of Artificial Intelligence.

CAIR has nodes at five South African universities: the University of Cape Town, the University of KwaZulu-Natal, North-West University, the University of Pretoria, and Stellenbosch University.

Their goal is to expand AI’s capabilities to include African languages, ensuring greater inclusivity and accessibility in AI solutions.

Overall, Steyn connects AI to making a difference in the community. “That’s another aspect where we incorporate the whole of AI, large language models, and the networks in terms of the day-to-day work,” he said.

“This is done to make a difference to the community because, in the end, that’s what we are trying to do with our graduates,” said Steyn.

UP’s EBIT has also been incorporating AI into its research. One “cool” project involves the automated detection of crocodiles from aerial images using deep learning and synthetic data.

Researchers are developing machine-learning models that can identify crocodiles. These models could potentially eliminate the need for dangerous manual counts in rivers and aid wildlife management.
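The article does not detail how these models are built, but a common pattern for this kind of task is to fine-tune a pre-trained object detector on labelled (partly synthetic) aerial images and then count the detections per image. The sketch below is purely illustrative and is not the UP researchers’ actual code: it assumes a single “crocodile” class, a torchvision Faster R-CNN backbone, and an arbitrary confidence threshold.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor


def build_crocodile_detector(num_classes: int = 2):
    # Start from a detector pre-trained on COCO and replace its head with
    # one that predicts a single "crocodile" class plus background.
    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model


@torch.no_grad()
def count_crocodiles(model, aerial_image: torch.Tensor, score_threshold: float = 0.7) -> int:
    # aerial_image: a (3, H, W) float tensor with values scaled to [0, 1].
    # The detection count stands in for a manual tally done from a boat or riverbank.
    model.eval()
    prediction = model([aerial_image])[0]
    return int((prediction["scores"] >= score_threshold).sum())
```

Training such a detector on a mix of real and synthetic aerial images follows the standard torchvision detection recipe; the synthetic data mainly helps when real labelled crocodile images are scarce.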

Another example is the work on crowdsource-based detection of anomalies on roads.

This project explores using dashcam footage and machine learning to automatically identify potholes and other road issues reported by everyday drivers, making road maintenance more efficient and improving safety.
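Again, the article does not describe the project’s pipeline. As a rough illustration of the crowdsourcing idea only, the sketch below assumes that each dashcam frame has already been scored by a pothole classifier and tagged with GPS coordinates, and then keeps only locations reported independently by several drivers; all names, thresholds, and the grid-based grouping are assumptions rather than the project’s actual method.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class FrameReport:
    lat: float         # GPS latitude of the dashcam frame
    lon: float         # GPS longitude of the dashcam frame
    confidence: float  # classifier score that the frame shows a road defect
    driver_id: str     # anonymised identifier of the contributing driver


def aggregate_reports(reports, cell_size=0.0005, min_drivers=2, min_confidence=0.6):
    """Group nearby positive detections into candidate road defects.

    A candidate is only kept if several different drivers reported it,
    which filters out one-off classifier mistakes.
    """
    cells = defaultdict(list)
    for r in reports:
        if r.confidence >= min_confidence:
            # Snap each report to a coarse lat/lon grid cell so that
            # detections of the same pothole land in the same bucket.
            key = (round(r.lat / cell_size), round(r.lon / cell_size))
            cells[key].append(r)

    candidates = []
    for group in cells.values():
        drivers = {r.driver_id for r in group}
        if len(drivers) >= min_drivers:
            lat = sum(r.lat for r in group) / len(group)
            lon = sum(r.lon for r in group) / len(group)
            candidates.append((lat, lon, len(drivers)))
    return candidates
```

In practice, the per-frame scoring would come from an image classifier or detector along the lines of the one sketched above, with the aggregation step turning noisy individual reports into a prioritised maintenance list.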
