Opportunities and risks of AI in the court system


The Beus Center for Law and Society building on ASU's Downtown Phoenix campus. ASU photo

“Science and innovation have always been an important part of ASU Law,” said Stacy Leeds, dean of Arizona State University’s Sandra Day O’Connor College of Law. “So when Stanford University Law School and Georgetown (University Law Center) approached us to do the conference, it was a quick yes.”

"Access to Justice and AI: New Frontiers for Research, Policy and Practice” took place on Wednesday, Jan. 15, at the Beus Center for Law and Society on ASU’s Downtown Phoenix campus.

The one-day conference brought together interdisciplinary researchers, policymakers and practitioners to discuss opportunities and risks that artificial intelligence (AI) and algorithmic decision-making present for access to civil and criminal justice.

Approximately 120 people attended the conference in person, with another 200 joining virtually. Sessions addressed themes such as accountability and transparency; bias and inequality; privacy, policy and regulations; and new and emerging research and programs.

The event was sponsored by ASU Law, Stanford University Law School, Georgetown University Law Center and the Access to Justice Research Initiative at the American Bar Foundation.

AI on your legal team

Rebecca Sandefur introduced the first panel, titled “AI and Its Implications for Access to Justice.”

Sandefur, director and professor at ASU’s T. Denny Sanford School of Social and Family Dynamics, noted the “hype” surrounding the technology and asked the panelists to define AI and generative AI.

“We need to leave the word ‘hype’ out of any discussion of AI and generative AI,” said Gillian Hadfield, research professor at Johns Hopkins University. “Because there is a reason for the hype.”

Hadfield discussed the idea of an attorney losing a court case and asking ChatGPT to generate a well-reasoned text explaining the loss.

Colin Rule talked about using AI in his area of expertise — conflict resolution.

“We all have conflicts,” said Rule, CEO of Mediate.com. “We fight with our spouses, we fight with our kids, we fight with our coworkers.

“When we think of the challenges and the limits of human cognition ... we know a lot about the neuroscience ... and the social science that creates the dynamics of conflicts.”

So what if “we can design systems that match those human dynamics ... and think of the needs of each person in the dispute. ... It could be another tool for building a pathway for each party,” Rule said.

He explained that AI can be used for case triage, for summarizing complex cases and for generating documents such as settlement agreements. It can be explored as the fourth component in conflict resolution scenarios that include two lawyers and a neutral. And it can be a tool for building consensus on big social issues by creating a model that takes differing opinions and explores areas of overlap.

AI would, of course, have its limitations in the area of law.

“It’s a big leap to (think you can) take a judge off the podium and put a robot up there with robes and a gavel,” Rule said. “What goes on in the mind of a judge or arbitrator or mediator is much more subtle and complicated than just throwing a ton of data at it.”

AI, ethics and accountability

Matthew Burnett opened the second session, “Do no harm? Ethics, AI and Accountability,” which explored the ethical concerns of AI, including privacy issues, entrenched bias concerns, replication (such as deepfakes) and accountability.

“Oftentimes with technology, and particularly AI, it’s difficult to see what is around the corner because it is evolving so quickly,” said Burnett, director of research and programs for the Access to Justice Research Initiative at the American Bar Foundation (ABF) and a visiting scholar at ASU’s Sanford School.

“What is missing from the conversation about AI?” he asked the panelists.

Marissa Duarte began with the basics. She talked about the difficulty of conceptualizing and explaining something as technically abstract as AI.

“I’m trying to teach people, fundamentally, what AI is,” said Duarte, a senior Global Futures scientist at ASU. “And that’s already very difficult.”

Duarte also talked about access to technology and the internet.

“It’s fundamentally unjust that we have very large swaths of communities that can’t afford the tools and devices ... to make use of these technologies and systems,” she said.

Panelist Peter Chapman said that technology is moving so fast, and with such urgency around it, that “it is hard to see around the corner.”

“The issue in the legal community is around the complexity and technology, and how that is going to interact with and equip the legal community,” Chapman said.

Chapman is associate director of the Knight-Georgetown Institute (KGI), a center at Georgetown University.

“It’s a rapidly evolving space in terms of what’s around the corner,” Chapman continued. If we wait, as a legal-and-access-to-justice community, for some of this to be hashed out, he said, “the train is going to have left the station and the systems won’t be aligned for their needs.”

Milan Markovic emphasized the need to pay attention to how AI is prompted and what data goes into those prompts. For example, when using generative AI as a legal tool, users may not know what data to feed into the large language model (LLM).

“These are things that would escape even a person with a college education,” said Markovic, a law professor at Texas A&M University School of Law. “In the legal community, it's hard to prompt for lawyers.”

How can AI be incorporated into the court system?

That was one of the questions discussed in the final panel of the day, titled “AI in Action: New and Emerging Research, Programs, and Policies.”

David Engstrom, a professor at Stanford Law School, discussed two AI tools being developed: one internal and court-facing, the other external-facing.

Engstrom talked about working with the massive Los Angeles court system, where AI could address high-volume work and basic processing, and about how to incorporate AI into court operations.

He pointed to default judgments, which are currently reviewed manually to determine whether a default is warranted, as an example of the manpower that could be saved by developing an AI process.

“That seems like something that is eminently automatable,” he said.

AI tools could also help people who cannot afford lawyers obtain information and direction for their legal situations.

He also urged courts to act on these proposals now.

“It is imperative for courts to start the process of figuring out how to incorporate these tools in all different levels of sophistication in their systems right now,” he said.

Hadfield added, “We are really at a fundamental, transformational point in history and we need to be quite thoughtful about the direction we are going in and the technology that is being built.”

Amid the rush to incorporate AI, Zach Zarnow urged caution, encouraging people to first consider a few key questions.

Zarnow said he is often asked, “How should I use AI?”

“That’s obviously the wrong question,” said the deputy managing director for Access to Justice.

“They should be asking themselves, ‘What problem are we trying to solve? What inefficiencies are we trying to get rid of? What goal do we have in mind?’” he said. “And work backwards from there.”
