AI Usage in SOFTENG 281
AI Usage in Assessment
SOFTENG 281 is an early adopter of the University of Auckland's new "Two-lane" approach, under which some assessments will allow the use of Artificial Intelligence (AI) while others will not. The purpose of the "Two-lane" approach is to give you clarity and transparency about the use of AI in assessment.
Each assessment in this course will be marked as Lane 1 (controlled assessments) or Lane 2 (uncontrolled assessments).
For more information, please see the University of Auckland's University-wide guidelines on the use of AI in assessment.
Specifically for SOFTENG 281, the two lanes mean:
Lane 1 — Controlled assessments
- University definition: graded assessment undertaken in controlled conditions. AI is not permitted.
- In SOFTENG 281: undertaken under invigilated conditions (i.e., in a computer lab). You will not have access to AI tools or other external resources, and you will be expected to write, debug, test and run code yourself (e.g., Test 1 and Test 2).
Lane 2 — Uncontrolled assessments
- University definition: graded or ungraded assessment undertaken in uncontrolled conditions. You can use AI to assist the development of your assessment, and this use is not restricted.
- In SOFTENG 281: undertaken on your own machine. AI tools may be available, but you are still responsible for what you submit (e.g., programming assignments).
Each assessment will specify at the top which lane it belongs to.
The core rule for Lane 2 assignments
Do not use AI to write the complete solution for you.
For example, do not use GitHub Copilot as a “TAB → TAB → TAB” button to accept large chunks of code you don’t understand. Similarly, don’t ask an AI tool (Copilot Chat, ChatGPT, etc.) to produce the complete solution for you.
If you do that:
- You miss the practice you need to actually learn programming, and the assignment becomes much less valuable for your learning.
- You will be unprepared for Lane 1 invigilated practical tests, where you must write, debug, test, and run code independently (without AI).
- Many students will end up with very similar AI-generated solutions, which are likely to be flagged by plagiarism/similarity detection.
Saying “I didn’t copy, the AI generated it” is not a valid justification. This applies to any AI tool (Copilot, ChatGPT, Claude, Gemini, etc.). You are accountable for the work you submit.
Acceptable ways to use AI (learning-focused)
You may use AI tools to support your learning without outsourcing the coding. The goal is to use AI like a tutor: it can help you think, but it should not replace the practice of writing and understanding your own code.
Examples of acceptable use:
- Explain concepts, for example, "what is an interface?", "what does @Override mean?", "how does polymorphism work?"
- Help you debug, for example, explain an error message, suggest where the bug might be, help you reason about edge cases, or suggest how to test a particular piece of code.
- Clarify requirements, for example, suggest a plan/checklist for how to implement a particular feature, or suggest test cases to validate a particular requirement.
- Help you learn syntax and Java library usage with small examples you then adapt yourself.
- Suggest test cases and edge cases, then you implement and run the tests yourself.
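To make the "small examples you then adapt yourself" point concrete, here is the kind of short snippet an AI tutor might produce when you ask what @Override and polymorphism mean. The class names here are made up purely for illustration; they are not part of any assignment.

```java
// Hypothetical tutoring example: @Override and polymorphism in Java.
class Animal {
    // Default behaviour; subclasses may override this method.
    public String speak() {
        return "...";
    }
}

class Dog extends Animal {
    @Override // asks the compiler to verify this really overrides Animal.speak()
    public String speak() {
        return "Woof";
    }
}

public class Main {
    public static void main(String[] args) {
        // Polymorphism: the static type is Animal, the runtime type is Dog.
        Animal a = new Dog();
        // Dynamic dispatch selects Dog.speak(), so this prints "Woof".
        System.out.println(a.speak());
    }
}
```

The learning value comes from adapting an example like this yourself (e.g., adding another subclass, or removing the @Override annotation to see what the compiler says), not from pasting it into an assignment.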
A simple self-check: if you could not reproduce the key parts yourself under Lane 1 conditions (invigilated and time-pressured), you probably haven't learned it well enough yet, and you are likely to struggle (or fail) in Lane 1 assessments. If that's the case, stop and learn the underlying concept first, before using AI to help you fill in the gaps.
What you must always be able to do
Even in Lane 2 assignments, you should only submit code that:
- You understand,
- You can write yourself, and
- You can explain (line-by-line if asked), including design choices and trade-offs.
In particular, you should be able to implement the same ideas again later in a Lane 1 assessment, without AI, within the available time.
A good rule of thumb: AI can help fill small gaps in your knowledge. But if it is filling big gaps, that’s a sign you should stop and learn the underlying concept first.
