Ethics in Legaltech: Addressing Concerns and Setting Standards


Here’s what legal professionals need to know about the ethical issues around legal technology use.

Legal technology has evolved rapidly in recent years. Emerging technologies are making it possible for legal professionals to automate document creation and accounts receivable, offload administrative tasks, boost productivity, cut costs, and even offer services in new practice areas. Legaltech has come a long way, and legal practices are adopting these solutions in droves. 

(Want proof? Check out Appara’s State of Canadian Legaltech report to discover how 443 legal professionals are using technology to drive efficiency in their practices.)

But while legaltech adoption has grown quickly, so, too, have legal professionals’ ethical concerns around these new technologies. Lawyers, paralegals, and other legal professionals are bound by professional codes of ethics to behave in certain ways, and all legal professionals know there are significant consequences for skirting the rules. That’s why law firms and in-house legal departments must ensure their use of legaltech complies with regulatory and ethical requirements. Here’s what you should know about staying ethical while using legaltech.

Be Transparent When Using Legaltech

Transparency and accountability are key ethical considerations when using any legal technology, but especially artificial intelligence. AI use specifically comes with something known as the “black box problem”; namely, human users often don’t know or understand how AI systems make decisions.

To counter the black box problem, lawyers and other legal professionals should be able to explain how and when they use legal technology. Which tasks are you using legaltech for? What decisions are you outsourcing to legaltech? How are you instructing your legal technology to behave? 

One easy way your firm can stay transparent and accountable in its legaltech use is to prepare a legaltech best practices document. This document would explain to staff members the best ways to use legal technology, including how to leverage generative AI in legal work, examples of generative AI prompts that produce the best results, and guidelines for verifying the accuracy of generative AI outputs.

Consider Bias and Fairness in Legaltech Use

When leveraging legal technology – especially AI-enabled legal technology – you’ll want to watch for bias in the tools themselves. Artificial intelligence tools can reflect human biases because they rely on human input: they are sometimes trained on data that encodes social disparities, and data can also be biased in the way it’s gathered. For example, in Broward County, Florida, an AI system called COMPAS is used to predict recidivism among criminal defendants. However, a ProPublica investigation determined that the COMPAS system is biased against Black defendants, often incorrectly labelling minority defendants as high risk for recidivism. ProPublica’s investigation found that not only is the COMPAS system racially biased, it’s also unreliable at forecasting crime.

Underlying data is often the source of bias, according to McKinsey. AI models, McKinsey notes, are often trained on data that contains human decisions – and humans are inherently biased.

When leveraging AI tools, you’ll want to ensure you’re doing so in an unbiased manner. While AI companies are making early progress on tweaking algorithms to be less biased, in the meantime it’s up to users to monitor AI outputs for bias. McKinsey notes that human judgment is a necessary part of the AI use process to ensure AI-supported decision-making is fair and just. AI users must be aware that AI can be biased and look for ways to correct that bias when it influences results.

Meanwhile, AI-enabled email software suite Levity recommends a variety of methods for countering AI bias. This includes an exercise known as “counterfactual fairness”; by changing sensitive traits like gender, race, or sexual orientation, one can determine if an AI model would make the same decision in a counterfactual situation.
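For readers who want to see the idea in concrete terms, the exercise can be sketched in a few lines of Python. This is a minimal illustration, not a production fairness audit: `risk_score` is a deliberately biased toy model standing in for any AI scoring system, and all names here are hypothetical.

```python
def risk_score(applicant):
    """A toy scoring model that (improperly) weights a sensitive trait."""
    score = applicant["prior_offenses"] * 10
    if applicant["race"] == "minority":  # the biased term we want to detect
        score += 15
    return score

def counterfactual_fairness_check(applicant, trait, alternatives):
    """Return True if changing the sensitive trait never changes the score."""
    baseline = risk_score(applicant)
    for value in alternatives:
        counterfactual = {**applicant, trait: value}  # same person, trait swapped
        if risk_score(counterfactual) != baseline:
            return False  # the model treats the counterfactual person differently
    return True

applicant = {"prior_offenses": 2, "race": "minority"}
print(counterfactual_fairness_check(applicant, "race", ["majority", "minority"]))
# False: the toy model's output depends on race, so it fails the check
```

The same pattern applies to any sensitive trait – rerun the model with only that trait changed, and a fair model should return the same decision.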

Levity also recommends implementing a Human-in-the-Loop system, a method where AI users provide the artificial intelligence system with continuous feedback so that the AI system can learn and improve. 
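In practice, Human-in-the-Loop simply means that a person reviews each AI output and that their corrections are captured as feedback. The sketch below illustrates the loop under assumed, hypothetical names – `ai_draft` stands in for any generative AI call, and the reviewer is just a function that edits the draft.

```python
def ai_draft(prompt):
    """Stand-in for any generative AI call."""
    return f"DRAFT: {prompt}"

feedback_log = []  # corrections collected here can later improve the model

def review(prompt, human_reviewer):
    """Route every AI output through a human before it is used."""
    draft = ai_draft(prompt)
    corrected = human_reviewer(draft)  # human checks and edits the output
    if corrected != draft:
        feedback_log.append((draft, corrected))  # record feedback for the AI system
    return corrected

final = review("Summarize clause 4.2", lambda d: d.replace("DRAFT", "REVIEWED"))
print(final)              # REVIEWED: Summarize clause 4.2
print(len(feedback_log))  # 1
```

The key design point is that nothing reaches the client without human sign-off, and every human correction is logged so the system can learn from it.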

Check Your Local Law Society or Bar Association for Guidelines

Before you start leveraging artificial intelligence tools and other legaltech in your legal practice, consult your local law society or bar association to see whether it offers resources that can help you implement tech use guidelines. Most law societies and bar associations publish some kind of primer on how legal professionals can stay ethical while using legaltech.

For example, in February 2020, the Missouri Bar Association published an article by Melinda J. Bentley, Legal Ethics Counsel for the Advisory Committee of the Supreme Court of Missouri, regarding the ethical implications of technology in legal practice. Bentley’s article discusses several key ethical obligations that apply when legal professionals use legal technology: competence, confidentiality, and responsibilities regarding nonlawyer assistants.

Meanwhile, the Canadian Bar Association has prepared an ebook guide for Canadian legal professionals that discusses legal ethics in the context of digital technologies. This guide covers a variety of topics including data security, appropriate use of legal technology, access to justice considerations, accessibility standards, and the ethical considerations of remote work.


Legal technology is changing the way legal work is done. From accelerating paperwork to reducing errors in filings to assisting with legal research and more, legaltech is making lawyers and other legal professionals more efficient and more effective at their work. However, legal technology – and AI especially – cannot operate independently. Legal professionals must implement guardrails when using legal technology and artificial intelligence to ensure their use of these tools is fair and unbiased. Legal practices can stay ethical with legaltech by preparing a legaltech best practices document for internal use – one that outlines ways to avoid unintentional bias, account for AI hallucinations, and ensure effective representation for clients. By implementing robust guardrails, legal practices can leverage legaltech in ways that enhance their effectiveness and reduce costs.

Is your firm looking for innovative legaltech solutions that can boost productivity, reduce costs, and eliminate mistakes? Appara can help. Appara is the smart legal professional’s go-to platform for entity management, workflow automation, and document automation. Schedule a demo to unlock your FREE trial today and discover how much time you can save with Appara.
