The head of the ACT's bar association has warned that barristers may "revert to using ChatGPT" for their work if they are not given proper training in artificial intelligence tools, risking serious consequences including citations to fictitious legal cases and breaches of client confidentiality.
Upskilling essential to avoid 'unwanted headlines'
ACT Bar Council president Prue Bindon is encouraging the territory's roughly 200 barristers to upskill in AI, with the aim of heading off damaging media coverage or findings of contempt of court arising from improper use of the technology.
"The risk if we don't get on top of this and in front of the curve on it, barristers will ... possibly revert to using ChatGPT for a bit of stuff, which is where some of your issues will come from, maybe breaching confidentiality or getting cases that are hallucinated or just wrong," Ms Bindon stated.
She emphasised that the legal community's fear of AI is often based on "very simplistic media reports" about its incorrect use. The Bar Council's mission is to "calm the fear and actually give some proper information and training."
Powerful tool demands adoption
To address this need, the ACT Bar Council has signed an 18-month agreement with AI Legal Assistant to run specialised training sessions for local barristers. Ms Bindon said only a small number of ACT barristers currently use AI in their practice, noting that some still rely on hard-copy diaries and do little typing.
She stressed the transformative power of the technology, warning that "if barristers don't get on board with this, they will be left behind." She added that any barrister who fails to start using AI will eventually be competing against many colleagues who have integrated it into their workflow.
The push for training comes amid growing scrutiny of AI use in Australian courts. There have been several high-profile instances where the technology has produced "hallucinations" – false citations to non-existent legal cases.
Judicial caution and future guidance
In February 2024, ACT Justice David Mossop criticised the "clearly inappropriate" use of AI to draft a character reference in a fraud sentencing case. At the opening of the 2025 legal year, ACT Chief Justice Lucy McCallum described the vision of AI-generated legal submissions in a virtual court as "chilling," though she affirmed the irreplaceable role of human advocates.
Conversely, ACT Law Society president Rob Reis, speaking at the same event, acknowledged that AI is "here to stay" and that the profession must embrace it. Later, in the ACT Bar Council's November bulletin, Justice McCallum wrote that it is critical for both practitioners and judges to understand the benefits and risks of AI as the technology evolves.
Ms Bindon clarified that AI would not be permitted to author core court documents such as witness statements and affidavits. The ACT Supreme Court is expected to publish a practice direction guiding AI use by legal practitioners in early 2026.
In her personal practice, Ms Bindon uses AI as an analytical tool to identify inconsistencies in documents, which helps her build legal arguments and prepare for cross-examination. "This product... can very, very quickly take your own documents, compare, contrast, tabulate, and give you essentially your cross-examination framework within a few minutes," she explained. "It is like you've got this trained person sitting there."