Equal Justice Project


The Rise of Legal Chatbots

By Lara Albert

Introduction

 

AI’s ever-increasing capability makes its adoption unavoidable. Historically, legal practice has become more efficient by slowly incorporating new technologies. Now that technological development has accelerated, there is less time for new technologies to be vetted and for regulatory systems to be implemented.[1] This article argues that AI should be permitted to provide legal advice to the NZ public in the same capacity as a legal advocate, but that this advice should be regulated, because unregulated legal advice poses a significant risk to the public.

 

AI’s ‘legal capacity’

 

The term ‘AI’ encompasses a wide range of technologies. In general, these technologies involve some level of ‘smart’ processing; examples include machine-learning algorithms and systems that imitate human learning. As such, they have the capacity to undertake legal research for the public.[2] Scholar Lisa Webley has noted that technology-assisted review (TAR) technologies in the UK survey the law so efficiently that a layperson using them could substitute for a lawyer.[3] AI technologies can also produce ‘smart contracts’ through blockchain technology.[4] Other AI technologies employ systems that can apply the law to specific facts and indicate potential outcomes.[5] These technologies have developed rapidly over the past ten years. According to scholars, this “massive expansion” of AI indicates that it is becoming a “widespread” presence in the economy and in social interaction.[6] AI’s ‘legal capacity’ is the first step in the argument that it should be able to provide legal advice to the NZ public: AI already has the capacity to provide legal services, and the technology is developing every day.[7] It is prudent to start regulating AI technology incrementally so that its benefits can be promoted within a manageable framework.

 

Benefits of AI providing legal advice to the public

 

Making AI legal technology available to the public would generate benefits for the NZ legal system. The way to achieve these benefits while avoiding negative consequences is proper, thoughtful regulation. This section highlights the benefits worth pursuing through regulation, including facilitating access to justice and strengthening the rule of law.

 

A report undertaken by Otago University confirmed that a large proportion of NZ’s population sits in a “justice gap.”[8] The systems NZ has in place to facilitate justice for people in the lower income bracket are largely ineffective. The maximum income qualification for legal aid is exceedingly low: an individual without dependants cannot earn more than $23,820 (substantially below a full-time minimum wage) to qualify.[9] Additionally, while the New Zealand Law Society (NZLS) recommends that lawyers perform 35 hours of pro bono legal services annually, only 41% of lawyers achieve this.[10] Another barrier to seeking legal help is the perception among laypeople that they “didn’t need any.”[11] Even people who recognise they need a lawyer will avoid seeing one out of concern that it will be time-consuming, unpleasant and expensive.[12] In the words of Chief Justice Winkelmann, “If courts dispense justice for only the few, what does this mean for our concept that we are a nation that exists under the rule of law?”[13]

 

AI can turn the tide. It is already emerging in some overseas jurisdictions to help resolve these sorts of issues. ‘DIY’ AI services have developed in the United Kingdom and the United States.[14] Such services use technology to support clients in doing legal work themselves.[15] The example of ‘DoNotPay’ demonstrates the nature of public-facing AI technologies. Platforms like ‘DoNotPay’ could work well within NZ’s legal system and should be encouraged: they are innovative and ethically motivated, and they make the law accessible and promote legal citizenship.[16]

 

In 2015, entrepreneur Joshua Browder launched ‘DoNotPay.’ His goal was to use “artificial intelligence to help consumers fight against large corporations and solve their problems.”[17] Browder’s ‘DoNotPay’ chatbot started by assisting people with challenging their parking tickets. It had a tangible impact: between 2015 and 2017, it saved people $9.3 million by challenging 375,000 parking tickets.[18] Since then, Browder has added more capabilities, and the bots can now help laypeople fill out forms (like a pregnancy leave request) or deal with other problems (like tenancy issues).[19] The bots are accessible for the public to use: they operate in natural language, so all the layperson has to do is state their problem. ‘DoNotPay’ highlights the potential public benefit that similar systems could bring to NZ.

 

AI technologies like ‘DoNotPay’ are often distributed online, which means they are more accessible and can be offered at more affordable rates through economies of scale.[20] By reducing the cost of legal advice, they improve access to justice. Customers will also have more control over their experience and will not be intimidated by the process of seeking out a lawyer; they will be free to determine the right course of action for themselves with the assistance of AI. A common assumption is that such technologies will displace lawyers; I would argue the contrary. These new technologies will bring more consumers to the market.[21] A growing legal market, with greater demand for specialised services, would compensate for the drop in price of the services AI provides. As well as improving access to justice, AI technology could also streamline the process of directing people to lawyers, relieving the pressure on small practitioners to go out and find clients.[22]

 

Examples of Ethical Challenges

 

Consent: The form of consent given by users of AI legal technologies poses a difficult ethical challenge. AI legal technologies are currently delivered through applications and websites, and people using these often ignore the terms they agree to as a condition of use. The periodic updates that most applications and websites undergo make it hard to keep track of these agreements. Leaving AI applications and websites unregulated can therefore leave individuals vulnerable.[23] The best strategy is to require that the software cannot operate without informed consent and that all consent be renewed and recorded. This is a difficult task because informed consent must be obtained differently from how it would typically be obtained at a person-to-person level.[24]

Transparency: Legal decisions should be motivated by and grounded in logic. Transparency becomes an ethical issue when technological findings are trusted despite a lack of clarity about the process that generated the decision or outcome.[25] The British Post Office scandal illustrates this issue. Between 2000 and 2014, the British Post Office used accounting software called Horizon.[26] The software was faulty, and because of a lack of transparency, accounting discrepancies could not be traced back to software malfunction. Instead, relying on the software’s results as evidence, the Post Office brought criminal and civil charges against hundreds of sub-postmasters.[27] These prosecutions resulted in numerous lost jobs, unjust prison sentences, and suicides.[28] Technological errors are difficult for humans to identify because the decision-making process of AI is not intuitive and has little in common with human decision-making.[29] The case teaches us that the law cannot simply trust the judgement of an unverified AI process. Admittedly, humans have also made errors that have caused great injustice, but a bias in favour of machine infallibility is just as dangerous. Combating this ethical concern requires us to (1) acknowledge that technology makes errors and (2) use technology to complement human ability, reducing the overall risk of error.

 

Regulation

 

AI legal advice should be regulated where the type of advice poses a risk to the public. Under the current regulatory framework, AI could provide most forms of legal advice to the public so long as it did not purport to be a ‘lawyer,’ ‘solicitor,’ or ‘barrister.’[30] The NZLS does not currently regulate non-lawyers providing legal services outside the reserved areas of work.[31] The NZLS should identify more reserved areas of law where inaccurate advice could harm the public. Within these new reserved areas, non-lawyers (AI and human) could operate under regulation. Non-lawyers providing more complex, higher-risk legal advice usually do so professionally, for reward; because these are professional services, they will be more responsive to regulation. The extent of this regulation would be proportionate to the significance of the service provided, so non-lawyers would be regulated to a lesser extent than lawyers. Ultimately, regulating these services will reduce the risk of harm to the public.

 

Conclusion

 

In conclusion, AI is a valuable tool for NZ’s legal system. This article has highlighted the benefits of allowing this technology into NZ, explored two ethical issues it raises, and outlined potential regulatory options. Ultimately, AI should be able to provide legal advice to the public through a regulated framework.

 

 


[1] Lisa Webley, Ethics, Technology, and Regulation (Legal Services Board, Birmingham, 2019) at 1.

[2] At 7.

[3] At 5.

[4] Andrew Zapotochnyi, “What are Smart Contracts?” (11 April 2022) Block Geeks <https://blockgeeks.com/>.

[5] Webley, above n 1, at 6.

[6] Vyacheslav Burlakov and others, “Introduction” in The Modern Trends of Development of AI Technologies (Springer, Switzerland, 2020) at xi.

[7] Samuel Dahan and David Liang, The Case for AI-Powered Legal Aid (2nd ed, Queen's Law Journal, New York, 2021) at 415.

[8] Kayla Stewart, Bridgette Toy-Cronin, and Louisa Choe, New Zealand lawyers, pro bono, and access to justice (University of Otago, Otago, 2020) at 1.

[9] Carlile Dowling Lawyers, “Legal aid and what do I qualify for” (6 September 2021) Carlile Dowling <https://carliledowling.co.nz/>.

[10] Stewart, above n 8, at 2.

[11] Rebecca Sandefur, What We Know and Need to Know About the Legal Needs of the Public (SC L. REV, Illinois, 2016) at 11.

[12] At 14.

[13] Derek Cheng, “New chief justice wants justice for all, especially the vulnerable” (17 December 2018) NZHerald <https://www.nzherald.co.nz/>.

[14] Webley, above n 1, at 3.

[15] Webley, above n 1, at 3.

[16] Jordan Bigda, The legal profession: From humans to robots (Journal of High Technology, Suffolk, 2017) at 412.

[17] Joshua Browder, “Our Mission” (2021) DoNotPay <https://donotpay.com/>.

[18] J. Mannes “DoNotPay launches 1,000 new bots” (12 July 2017) Techcrunch <https://techcrunch.com/>.

[19] Browder, above n 17.

[20] Benjamin Barton and Deborah Rhode, Access to Justice and Routine Legal Services: New Technologies Meet Bar Regulators (Hastings Legal Journal, San Francisco, 2019) at 2.

[21] At 7-8.

[22] At 8.

[23] Sara Gerke, Timo Minssen, and Glenn Cohen, Ethical and legal challenges of artificial intelligence-driven healthcare (Elsevier Public Health Emergency Collection, Copenhagen, 2020) at 6-7.

[24] Gerke, above n 23.

[25] At 8.

[26] Kevin Peachey, “Post Office scandal: What the Horizon saga is all about” (March 2020) BBCNEWS <https://www.bbc.com>.

[27] Peachey, above n 26.

[28] Peachey, above n 26.

[29] Gerke, above n 23, at 8.

[30] Lawyers and Conveyancers Act 2006, s 22.

[31] Section 24.

