With the support of an FFYE grant, Dr Lam Dinh Nguyen has been exploring the use of Copilot as a supportive tool for students solving mathematical problems in first-year engineering subjects. In this post, Lam shares insights from this process.

To help students develop skills in solving mathematical problems, I assist them by explaining complex concepts clearly and working through mathematical examples in class. Students are further supported with comprehensive teaching materials, tutorial sessions, short instructional videos and the U:PASS service. GenAI (in our case, Copilot) provides another path to support alongside these practices.

Aims of our FFYE grant project

Grant team: Lam Dinh Nguyen (lead), Dee Wu, Trung Ngo, Sangharsha Bhandari, Fan Wu 

First-year Civil Engineering students are challenged when navigating assessment tasks that involve complex mathematical formulas and calculations. Insufficient understanding of the topics, incorrectly adopted parameters and calculation errors lead to incorrect answers and often incomplete submissions. Students frequently rely on online sources or GenAI to derive answers, which they accept without scrutiny.

This project aims to build students’ learning and confidence in problem-solving and in their use of GenAI tools, by intentionally developing their skills in using GenAI to evaluate, refine, and verify calculations. This approach reinforces theoretical concepts through assessing the reliability of outcomes, creating a shared academic purpose and strengthening students’ identity as aspiring civil engineers.

Understanding the risks of GenAI

Several key risks were identified and considered while developing the assessment task with GenAI:

  • Academic integrity: There is a fine line between using AI to improve student learning and academic misconduct. Students may use GenAI to obtain answers and incorporate them directly into the assessment task, which is considered academic misconduct. Students may also be confused about the intention of the AI exercise, leading to misalignment with the intended learning outcomes.
  • Reliance on GenAI: When students interact with GenAI on a regular basis, they tend to depend solely on its responses without critically evaluating the outputs. GenAI is known to produce incorrect results, especially for complex mathematical problems, so the accuracy and reliability of AI outputs need to be carefully considered and evaluated.
  • Equity and access: Students have different skills and experience in using AI. Moreover, there are many readily available GenAI tools with different capabilities, and some offer advanced features at an additional cost, which creates disparities among students. There are also real concerns about data privacy and security, including sharing sensitive information on GenAI platforms. It is important to adopt an appropriate, institution-approved AI tool that is available and equitable to all students.
  • Authentic assessment design: The assessment task must be authentic, allowing students to apply their learning to a real-world context. GenAI should be used to ‘support’ students, not to solve the problems for them, which would risk academic misconduct. This approach ensures that students apply judgement, critical thinking and decision-making skills. Designing authentic assessment also requires training for tutors and lecturers, who need to understand the intention and guide students in using GenAI effectively and responsibly.

Communicating clearly with students

Clear instructional guidance is crucial for this process. It needs to be carefully planned to enhance student learning and to state very clearly what GenAI can and cannot be used for. In this case, students are allowed to use GenAI to verify the accuracy of their manually completed calculations; they cannot copy GenAI responses verbatim without attempting the calculations themselves. Moreover, students may only use GenAI for certain assessment tasks where this is explicitly indicated in the assessment instructions, and they may only use Microsoft Copilot, a UTS-approved GenAI tool, to ensure equity and data security.

The instructional guidance includes teaching materials and examples that train students to write and refine quality prompts. It also needs to model the critical thinking process that helps students assess the accuracy and reliability of GenAI outputs and, most importantly, cross-check them against their manually completed calculations.

The instructions for the assessment task need to be carefully worded to align with the intended learning outcomes. The requirements must be clear and specific to the tasks. The emphasis needs to be on the effective use of GenAI to encourage critical thinking and deeper understanding of the topics, rather than simply matching answers. Evidence of using GenAI, including the direct outputs and the steps taken to resolve discrepancies, must be documented to show the development of critical thinking and problem-solving skills. Additionally, a reflection on the use of GenAI in the assessment needs to be included.

Learnings from the process

Students can get timely support from GenAI, including explanations and step-by-step solving procedures. As they interacted with GenAI regularly, students quickly realised that it does not always provide correct or sensible answers. This limitation acts as a checkpoint, reminding them to double-check their manually completed answers or refine their prompts. The habit of self-review encourages students to be critical thinkers, which boosts their confidence in their answers and in their ability to use GenAI effectively.

Below are a few reflections and adjustments to make GenAI integration more meaningful and sustainable:  

  • Some students were reluctant to use GenAI due to uncertainty and a lack of understanding of GenAI’s limitations.
  • Students who used GenAI more effectively showed stronger understanding and problem-solving skills, while others still relied on GenAI for answers without critical evaluation.
  • This could be improved by reinforcing the use of GenAI early in the subject, for example through a dedicated module on using GenAI with more examples and further instructions, and by strengthening training in using GenAI responsibly, including how GenAI works, its limitations and prompt engineering.
  • The intention of GenAI use in assessment tasks must be clearly communicated to learners. An adjustment could be rewording ‘verification’ as ‘validation of process’ or ‘double-checking’ to move away from an answer-matching exercise.
  • Continue seeking feedback from students and providing them with timely feedback on the use of GenAI in assessment tasks.
  • The teaching team meets regularly to discuss and share observations on the effectiveness of GenAI in assessment tasks and identify strategies for improvement.
