Ethical Uses of Artificial Intelligence in the Australian Legal Profession
Posted on June 13, 2025
Artificial Intelligence (AI), and particularly Generative AI (GenAI), is revolutionising sectors worldwide, including the legal profession. GenAI systems, which can produce human-like text, images, and multimedia content, are increasingly being integrated into legal practice for tasks such as document review, legal drafting, and research. However, this transformation also introduces critical ethical, professional, and regulatory considerations.
Understanding Generative AI and Its Legal Applications
GenAI refers to a class of AI technologies that generate new content based on large datasets, rather than merely classifying or analysing existing data. These systems are powered by large language models (LLMs) such as OpenAI's GPT-4, which are trained on vast quantities of text from books, websites, and other digital sources.
GenAI is being rapidly adopted in the legal industry due to its potential to automate and augment routine legal tasks. Lawyers are leveraging tools such as ChatGPT, Lexis+ AI, CoCounsel (by Thomson Reuters), and Microsoft Copilot (amongst others) to:
- Conduct initial legal research, especially in large-scale matters where document volume is significant;
- Assist in litigation by summarising case materials, identifying relevant precedents, and building chronologies;
- Draft and review contracts, including identifying risky clauses and suggesting alternative wording;
- Streamline process-driven practice areas such as debt recovery and credit management, and wills and estates;
- Generate marketing content such as newsletters, blog posts, and opinion articles efficiently.
Additionally, some law firms have invested in creating proprietary AI tools and are experimenting with custom-trained AI models tailored to their internal databases. These proprietary models offer the advantage of privacy control and can be fine-tuned to reflect jurisdiction-specific nuances, client preferences, or internal drafting styles.
Despite these benefits, the use of GenAI introduces significant ethical and operational challenges. AI tools may produce plausible-sounding but inaccurate outputs—known as 'hallucinations'—that require careful human oversight. They may also struggle to interpret the context of human behaviour, legal nuance, or emotional complexity, which is critical in areas such as family law, criminal defence, or mediation. Therefore, while GenAI offers efficiency and scale, its use must be carefully supervised and never wholly substitute for a lawyer's professional judgment.1
Ethical Obligations Under Conduct Rules
Lawyers must balance the opportunities of AI with their ethical duties, many of which are codified in the Australian Solicitors' Conduct Rules (ASCR), especially as lawyers increasingly incorporate tools that may produce outputs beyond their direct control.
Key ethical obligations include:
- Competency and Diligence (Rule 4.1.3): Lawyers must deliver services competently and promptly. GenAI can enhance productivity, but it can also introduce errors—especially when hallucinated legal precedents or misinterpreted facts are unknowingly relied upon. Lawyers must verify the integrity and accuracy of any AI-assisted work. The use of AI cannot justify a breach of the duties of diligence or care.1
- Client Confidentiality (Rule 9): The duty of confidentiality is paramount. Inputting confidential client data into publicly available AI tools risks breach of client privacy, particularly if the AI system stores queries or trains on input data. Lawyers should only use secure, closed-source AI tools or those covered by appropriate contractual safeguards.2, 3
- Independence and Judgment (Rule 17): AI cannot replicate the nuance, ethics, or context-specific analysis required for sound legal judgment. Relying excessively on AI may compromise a lawyer’s ability to deliver truly independent advice. Lawyers must take responsibility for the final content and should treat AI-generated suggestions as supportive tools, not authoritative sources.4, 5
- Duty to the Court (Rule 19): Lawyers are officers of the court and must not mislead it. Any submission containing citations, summaries, or arguments generated with the assistance of AI must be verified manually.6 There is a growing trend of courts requiring disclosure of AI use, reinforcing the need for accuracy, transparency, and accountability.7
- Supervision of Legal Services (Rule 37): The ASCR requires appropriate supervision of legal services, which now extends to overseeing the use of AI systems. Senior practitioners are responsible for ensuring that junior staff, and their use of AI tools, are appropriately supervised.2 Policies and training should be in place to ensure AI use aligns with ethical and professional expectations.8
Judicial Guidance on AI Use
Courts across Australia are responding to the rapid uptake of GenAI by issuing practice notes and guidance aimed at safeguarding the integrity of legal proceedings while allowing for responsible innovation. These responses reflect growing concern over the misuse of AI-generated content, especially in evidentiary submissions and legal arguments.
The Supreme Court of NSW's Practice Note SC GEN 23, effective from 3 February 2025, provides one of the most detailed frameworks to date on AI use in legal proceedings. It delineates acceptable and prohibited uses of GenAI:
- Prohibited Uses: GenAI must not be used to draft affidavits, witness statements, or expert reports; these documents must reflect personal knowledge and contain no AI-generated content.9
- Permitted Uses: GenAI may assist with non-evidentiary tasks like chronologies, indexing, or summarising documents, subject to human verification and disclosure.10
- Disclosure: Lawyers must verify AI-generated submissions and be prepared to explain what content was AI-assisted, ensuring AI does not replace professional judgment.11
The Supreme Court of Victoria has also issued comprehensive guidelines titled Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation, which outline ethical expectations for both legal practitioners and self-represented litigants.12
This guidance aligns with national trends, emphasising transparency and ethical accountability:
- AI use must be disclosed when relevant to document reliability.
- Practitioners are accountable for AI-assisted content and must not use GenAI in evidence documents like affidavits or expert reports.
- Caution is urged regarding confidentiality, and lawyers must understand the limitations of AI tools.13, 14
On 29 April 2025, the Federal Court of Australia issued a Notice to the Profession addressing the growing use of GenAI in court processes. The Notice reinforces that:
- Parties and practitioners are expected to use AI tools in a responsible manner, aligned with existing duties to the court and opposing parties;
- The Court retains the right to request disclosure of AI use where appropriate;
- A consultation process is underway to inform the development of formal Guidelines or Practice Notes that may govern AI use in the Federal Court.15
This signals a shift from passive observation to proactive governance, as Australian courts strive to balance technological integration with procedural fairness and judicial efficiency. The move to consult with the legal profession and other stakeholders illustrates the judiciary’s commitment to transparency, collaboration, and public trust.
Regulatory Perspectives: Legal Boards and Societies
Multiple Australian regulators, including the Law Society of NSW, the Legal Practice Board of Western Australia, and the Victorian Legal Services Board and Commissioner, issued joint statements in late 2024 and 2025 reinforcing ethical AI use. Key positions include:
- Confidentiality: Avoid AI tools that retain or train on input data unless robust privacy safeguards are confirmed.
- Independent Advice: AI must never replace a lawyer’s judgment.
- Competence: Lawyers should be trained to understand AI’s strengths and limitations.
- Fee Transparency: Clients should be informed of AI use where it impacts billing. Fees must reflect the actual professional input.16
Firms are encouraged to implement internal AI governance policies and disclose them to clients to build trust and accountability.17
Key Principles for Ethical AI Use in Legal Practice
The joint regulator statements and court directives confirm a set of actionable principles that legal practitioners should apply when using GenAI tools in practice:
- Confidentiality and Data Protection – Only use AI platforms with strict data security protocols.
- Verification of Outputs – Legal citations and conclusions must be cross-checked by qualified practitioners.
- Transparency – Disclose AI use to clients, courts, and colleagues.
- Human Oversight and Responsibility – Legal accountability cannot be delegated.
- Ongoing Education and Policy Engagement – Stay updated and contribute to responsible firm-wide policies.
Conclusion
Generative AI can improve efficiency and access to justice in legal practice. However, ethical implementation is vital. Lawyers must take ultimate responsibility, verify all AI contributions, and align with the evolving landscape of professional obligations. By embedding core legal values into AI use, the profession can innovate responsibly while safeguarding justice and public trust.
Evyenia is the current Chair of the Artificial Intelligence and Data Privacy Committee, and Co-Chair of the Commercial Law Committee at the Law Society of South Australia.
Endnotes
1 Law Council of Australia, 'Australian Solicitors’ Conduct Rules 2015' (at 1 January 2024), available at
https://www.lawcouncil.asn.au/resources/australian-solicitors-conduct-rules
2 Ibid
3 Legal Services Commissioners and Law Societies, 'Statement on the Use of Artificial Intelligence in Australian Legal Practice' (December 2024)
4 Law Council of Australia, 'Australian Solicitors’ Conduct Rules 2015' (at 1 January 2024), available at
https://lawcouncil.au/policy-agenda/regulation-of-the-profession-and-ethics/australian-solicitors-conduct-rules
5 Legal Services Commissioners and Law Societies, 'Statement on the Use of Artificial Intelligence in Australian Legal Practice' (December 2024)
6 Law Council of Australia, 'Australian Solicitors’ Conduct Rules 2015' (at 1 January 2024), available at
https://www.lawcouncil.asn.au/resources/australian-solicitors-conduct-rules
7 Legal Services Commissioners and Law Societies, 'Statement on the Use of Artificial Intelligence in Australian Legal Practice' (December 2024)
8 Evyenia Walton, 'Ethical Uses of AI for Lawyers' (Presentation, Lynch Meyer Lawyers, 25 March 2025)
9 Supreme Court of New South Wales, 'Practice Note SC GEN 23 – Use of Generative Artificial Intelligence', 3 February 2025
10 Ibid
11 Ibid
12 Law Society of NSW, Legal Practice Board of WA, and Victorian Legal Services Board and Commissioner, 'Joint Statement on AI Use', 2024–2025
13 Ibid
14 Supreme Court of Victoria, 'Guidelines for Litigants: Responsible Use of Artificial Intelligence in Litigation' (2024), available at
https://www.supremecourt.vic.gov.au/forms-fees-and-services/forms-templates-and-guidelines/guideline-responsible-use-of-ai-in-litigation
15 Federal Court of Australia, 'Notice to the Profession – Artificial Intelligence Use in the Federal Court of Australia', 29 April 2025
16 Legal Services Commissioners and Law Societies, 'Statement on the Use of Artificial Intelligence in Australian Legal Practice' (December 2024)
17 Law Council of Australia, 'AI Guidance to Safeguard Consumers of Legal Services' (Media Release, 6 December 2024)
This article originally appeared in the June 2025 issue of the Law Society Bulletin, published by The Law Society of South Australia. Republished with permission.
