Artificial intelligence (AI) tools such as ChatGPT or Copilot should not be relied on for legal information or legal reasoning. The information they generate is often wrong or made up. These errors are called "hallucinations". The AI tool might give:
- CRT or court cases that do not exist
- Inaccurate or fake legislation
- Incorrect analysis of a legal issue
Is it illegal to use AI tools in a CRT claim?
It’s an offence under the Civil Resolution Tribunal Act to give the CRT false or misleading information. Under the CRT Rules, a participant:
- Must not include fake cases or legislation in their arguments, such as those created by an AI tool
- Must not give fake evidence, such as evidence created or altered by an AI tool
What happens if I use AI tools?
Tribunal members make decisions based on the law, and on the evidence and arguments from the participants. If a tribunal member finds your evidence or arguments are unreliable, you might not get the decision you want or feel you deserve. Read CRT decisions where participants used AI tools that generated false or misleading results.
You may also be ordered to pay money. False evidence, cases, or legislation created by an AI tool may waste time and cause unnecessary expenses. The tribunal member may order you to pay another participant's:
- Legal expenses
- Compensation for the time they spent
If you're not sure whether your arguments and evidence are reliable, or what evidence you should submit for your claim, you may want to get legal advice.
See the CRT Rules for more information.