Learning with an ‘AI buddy’


Seeing double: Imagine if every lecturer had a virtual assistant available to students 24/7. — 123rf.com

Virtual teaching assistants enhance student engagement with customised support

We live in a digital world where the modern workforce increasingly relies on digital innovations to drive solutions, generate new content, and harness transformative technologies that can redefine industries.

This calls for an educational approach that prepares our graduates for a rapidly changing world, equipping them with adaptability and tech-savviness – skills highly valued by employers.

Universities, in particular, need to move away from traditional passive knowledge dissemination to dynamic, interactive learning experiences that empower students to take ownership of their educational journey.

This approach cultivates curiosity, self-directed learning, collaboration, and problem-solving skills, enabling students to thrive in a technology-driven world.

The emergence of artificial intelligence (AI)-driven technologies has transformed the way students receive personalised assistance and support, marking a revolutionary development in education.

Imagine if every university lecturer had an AI counterpart available to their students 24/7. This long-awaited goal in online learning is becoming a reality, making personalised learning accessible to every student.

These AI doubles, more commonly known as virtual teaching assistants, will “work closely” with lecturers to handle routine tasks such as generating quizzes, providing personalised feedback on assignments, offering immediate assistance, answering questions, and clarifying doubts in real time. This frees lecturers to focus on direct student interaction and ensures targeted support for every student. It is one example of how learning with AI can exceed what is possible without it.

For students, the use of an AI buddy in learning could enhance their engagement, as they will receive customised support in various ways.

An AI buddy can create personalised content tailored to each student’s learning preferences. By integrating dynamic conversation prompts based on student responses, an AI buddy can foster a deeper understanding, encourage critical thinking, and help students connect concepts to real-world issues.
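
As a simple illustration of how such dynamic conversation prompts might work behind the scenes (a hypothetical sketch only: the function names, prompt wording, and the call_llm placeholder are assumptions, not a description of any particular product), an AI buddy could feed each student response back into a language model with an instruction to ask a Socratic follow-up question:

```python
# Illustrative sketch only: call_llm stands in for whatever language-model
# service a university's AI buddy might use; it is not a real library call.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("Replace with the institution's chosen LLM service")

def next_conversation_prompt(topic: str, student_response: str) -> str:
    """Build a follow-up question that nudges the student toward deeper thinking."""
    prompt = (
        f"The student is learning about: {topic}\n"
        f"Their latest answer was: {student_response}\n"
        "Ask ONE short follow-up question that encourages critical thinking "
        "and connects the concept to a real-world issue. Do not give the answer."
    )
    return call_llm(prompt)
```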

Additionally, an AI buddy can track students’ progress in the learning management system and improve learning outcomes through enhanced, personalised feedback.

For instance, lecturers could use rubric-based AI grading to evaluate student responses in assignments, offering tailored feedback for improvement to each student.
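
One minimal way to picture rubric-based AI grading (again a hypothetical sketch under assumed names, not the tool any particular university uses) is to pass each rubric criterion and the student’s submission to a language model and collect per-criterion feedback for the lecturer to review:

```python
# Hypothetical sketch of rubric-based feedback; call_llm is a placeholder
# for the grading model an institution might deploy, not a real API.
from typing import Dict

def call_llm(prompt: str) -> str:
    raise NotImplementedError("Replace with the institution's chosen LLM service")

def rubric_feedback(submission: str, rubric: Dict[str, str]) -> Dict[str, str]:
    """Return tailored feedback for each rubric criterion.

    `rubric` maps a criterion name (e.g. 'Argument clarity') to a description
    of what full marks look like. The lecturer still reviews every comment.
    """
    feedback = {}
    for criterion, descriptor in rubric.items():
        prompt = (
            f"Rubric criterion: {criterion}\n"
            f"What excellent work looks like: {descriptor}\n"
            f"Student submission:\n{submission}\n"
            "Give two sentences of specific, constructive feedback on this "
            "criterion and one concrete suggestion for improvement."
        )
        feedback[criterion] = call_llm(prompt)
    return feedback
```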

However, adopting AI in education at the university level presents several challenges. If students do not use generative AI (GenAI) responsibly, it may compromise the attainment of learning outcomes and, ultimately, the quality of graduates.

Students need to understand the limitations of GenAI technology. It is not perfect and can produce false or illogical information.

In certain circumstances, GenAI may lack discernment and be unable to distinguish between right and wrong information. It can fabricate information and generate fictitious references to non-existent texts.

Furthermore, students must not submit AI-generated work as their own without proper attribution; doing so constitutes academic dishonesty and is equivalent to cheating. The use of GenAI must be referenced appropriately, following the referencing style required by their lecturers.

Students should consult their course instructors about the expectations regarding the use of GenAI for learning tasks or assessments.

They are responsible for checking and verifying the results generated by AI against other sources, acknowledging the contribution of GenAI technology, and being accountable for any errors or omissions.

Lastly, the ethical considerations in embracing GenAI in education include the privacy and security of students’ data. GenAI often relies on large language models, which require massive amounts of data to train effectively. The deployment of AI in education must prevent unauthorised or unintended use of students’ sensitive personal information and academic records.

Universities must also implement robust ethical guidelines, including obtaining informed consent from students for data usage, and ensuring students have the option to opt out of any data collection by the AI system.

It is also crucial for universities to ensure their AI systems are trained with transparent algorithms, and are regularly audited for responsible and optimal use of GenAI.

Assoc Prof Dr Lim Chee Leong is the director of Learning Innovation and Development (LID) at the Centre for Future Learning, Taylor’s University. With over 20 years of experience in the education industry, he has a sustained record of strategic leadership in learning innovation, academic leadership, research and services, placing the varsity at the forefront of transformational teaching innovation and creativity. The views expressed here are the writer’s own.
