Judicial Decision Making

Volume 8 • Issue 3 • March/April 2022

Data in the Court

Judicial analytics in practice

In recent years, legal technology has swept the industry—in contracts, in due diligence, in patents, in legal research, and perhaps most successfully, in e-discovery. Given the proliferation of such tools across the legal profession, it should come as no surprise that they—and the technology underpinning them—are also entering courtrooms. For instance, many have written about the use of technology to aid in setting bail or recommending sentences for convicted parties. Most recently, a whole cottage industry has formed around judicial analytics. If “Judges in the Lab” attempted to get into the heads of judges by tracking document view patterns and other indicators as they decided a fictional case, judicial analytics attempts to use data culled from publicly available dockets to give clients insight into judge and court behavior and thereby inform litigation strategy. In the past decade, large players, including Thomson Reuters and LexisNexis, as well as smaller legal tech startups like Gavelytics and Trellis, have stepped in to provide insights into judges, giving robust, empirical sourcing to what was once mere anecdote. As these analytics become an industry standard, more law firms are finding it necessary to have judicial data on hand to inform strategy, while in-house teams are subscribing to inform how they select outside counsel for litigation. While there are important technical debates to be had around the technology underpinning these tools, this story focuses on how the tools that do exist are used in the marketplace.

In this article, we speak to Robert (Bob) Ambrogi, legal tech journalist and media lawyer, about why judicial analytics is a valuable tool for lawyers looking to compete in today’s profession. Then, we speak with two legal tech users—Wendy Butler Curtis, chief innovation officer at Orrick, and Kate Orr, senior innovation counsel at Orrick—about what role analytics play in the law firm space.

Judges as data points

In 2015, Bob Ambrogi wrote a column for his blog, LawSites, titled “In Litigation and Legal Research, Judicial Analytics Is the New Black,” predicting that judicial analytics would be the next big thing for the profession. Ambrogi, who has covered legal technology for decades, reported from the American Association of Law Libraries annual meeting, where three different organizations were showcasing their judicial analytics platforms: Lex Machina, Ravel Law, and ALM. In the years since, many more companies have launched to provide such services. Companies like Gavelytics and Trellis, which began with a focus on judges, have rapidly expanded their scope, seeking to provide both more state judicial data and complementary data around the courtroom, such as expert witness analysis, opposing counsel research, and more. Established legal research players like LexisNexis, which acquired both Lex Machina and Ravel in the past few years, and Thomson Reuters’s Westlaw Edge have also launched tools focused on judges within their legal analytics suites (Context and Precedent Analytics, respectively). And many of the products have moved from simply including docket data—bare-bones recorded court data such as the length of a trial, the counsel on record, the subject of the suit, and so forth—to more sophisticated content analysis. Thus, tools like Precedent Analytics and Context examine the language in court decisions, taking into account judicial citation patterns in an attempt to predict (or at least provide insight into) how a judge makes decisions. (For more on all the companies mentioned above, see Ambrogi’s podcast, LawNext, which includes interviews with the founders of Gavelytics, Trellis, and more.)

Legal analytics are “expanding our understanding of what it means to conduct legal research,” says Bob Ambrogi.

Against that backdrop, Ambrogi tells The Practice that judicial analytics, broadly construed, has largely lived up to its early promise. Rather than dive into PACER or search through public state or county records—much of which may not be digitized or standardized—the new tools save lawyers time and energy by providing a user-friendly interface that filters and visualizes information. By using these judicial analytics tools, Ambrogi says, lawyers can ask questions like: “Is a judge more or less likely to grant a motion for summary judgment?” and “How long does this judge usually take to decide a certain kind of case?” And even, “What influences does that judge tend to rely on?” It’s up to the lawyer to interpret the data as they see fit—Ambrogi notes that judicial analytics cannot predict what a judge will do in any specific instance. But it may give you a suite of data points to help you make that judgment call. Given this, Ambrogi suggests, the profession has moved from questioning whether practitioners need such tools to asking, Is it malpractice not to use them if they’re available?

For Ambrogi, these platforms are game-changing because they’re “expanding our understanding of what it means to conduct legal research—that legal research is not just about cases and statutes; it’s about gaining insights into the judges and lawyers and parties involved in a matter and how we adapt those cases and statutes and our own strategies based on those insights that we’ve gained,” he says. In particular, he is fascinated by advances in content analytics—though notes there is work to do. Ambrogi says:

If you know that this judge, based on analysis of their past decisions, finds cases out of the Second Circuit to be really persuasive in deciding IP matters, then you know you’re going to focus your brief on cases decided by the Second Circuit. And if you have been able to see that this judge often tends to cite certain cases as the controlling precedent in a particular matter, then you’d be a fool not to make sure you cite those cases as the controlling precedent in your brief. If you do nothing but speak to the judge’s particular profile and propensities, these analytics can give you an edge, and in litigation, that’s significant.

What’s coming down the pipeline? Ambrogi would love to see more marriages between legal technologies, citing what might happen if you layered a product like Casetext’s Compose, which creates a first draft of a brief, onto litigation analytics so that your first draft already caters to the court in which you’re arguing. But before we see any technological synergies, the data in each service probably has to improve. Studies conducted in 2019 and 2020 by law firm librarians compared federal litigation products, revealing stark differences attributable to the quality and quantity of each product’s data. For instance, when asking a question like, “In how many [patent] cases has Irell & Manella appeared in front of Judge Richard Andrews in the District of Delaware [since January 1, 2007],” the studies found that “no two vendors had the same answer.”

The tools might be a game-changer, but they also have much further to go to truly change the game.

According to the ABA Journal, the reason for such wildly divergent results has to do with the underlying inconsistency of the data in each product. The platforms in question all drew on federal litigation data from PACER, which itself costs money—leading platforms to pick and choose what they buy within a budget—and the data within PACER is frequently flawed. Typos and erroneous nature of suit (NOS) codes abound, and omissions occur, such as when counsel changes halfway through a case. As Ambrogi notes, platforms like Trellis and Gavelytics, which are trying to account for even less systemized state data, have even more work ahead of them.

What this means is that the tools might be a game-changer, but they also have much further to go to truly change the game. Normalizing and systemizing the data will take time and money. “They’re using AI to some extent to try to do that, help with that cleanup, but there still have to be humans involved, and where you need intensive work by humans, it’s a very expensive process,” says Ambrogi.

The user’s experience

As senior innovation counsel at Orrick, Kate Orr thinks carefully about how technology can complement and bolster the firm’s lawyers and the work that they do. About four years ago, Orrick integrated judicial analytics into CaseStream, their homegrown case management system. Every litigation matter flows through CaseStream, run by a case manager professional (typically not a lawyer) whose job is to connect the team with records, resources, and tools—including a suite of litigation and judicial analytics. “We feed the analytics to our lawyers and teams at the start of a case, or even before if they’re trying to decide which jurisdiction to file in, so they don’t have to remember to go find them,” Orr says. She continues:

When we serve up that platter of analytics, we say, “OK, you have a case in this court with this judge against this opposing counsel. Here are the analytics that are available to you from the resources that we have here at Orrick, and here’s a link where you can manipulate the analytics yourself.” Because that’s an important part of using them. You can have the report, but there is often a lot more you can visualize with the analytics through filtering and sorting. That work requires someone familiar with the facts and law at issue in the case.

Wendy Butler Curtis, Orrick’s chief innovation officer, adds that it was critical to serve the data in an easy-to-use manner. She notes, “We’ve got tech fatigue, we’ve got platform fatigue, we’ve got login fatigue, so Kate’s model helps meet people where they are, emphasizing that the people and processes are as important as the data analytics that these tools are providing.”

While both Butler Curtis and Orr stress the importance of the data in helping to inform decision making, both also note that the analytics, as they exist, are imperfect. “It’s just one thing that we think about, but it is very rarely the determining factor,” says Butler Curtis. As a case in point, she notes that even if a lawyer knows that a particular judge has denied motions to dismiss 70 percent of the time—a data point that current tools can largely determine—that percentage does not capture the complexity or difficulty of the case or argument. The analytics succeed when providing informative or illustrative data, says Orr, such as “a snapshot of the judge and her history, the breakdown of the kinds of cases she works on, her propensity to dismiss cases, the time it takes to resolve cases, and typical outcomes.” But “they aren’t as strong on strategy.” There is some information one can glean, such as the cases a judge typically relies on, which may factor into legal strategy, but it doesn’t tell the whole story. This gap in analysis reminds Orr of how a streaming service might misread your choices. She says:

I turn on Netflix and I watch some terrible old movie for the sake of nostalgia because I watched it when I was 10 years old. I wasn’t watching the movie because I like that genre, and yet, the next time I turn on Netflix, it’s recommending other terrible old movies to me. There’s a layer of personalization missing there that reminds me of analytics and where they fall short. There’s some data there, but we’re still working on how we can actually use it and what value it brings.

Wendy Butler Curtis and Kate Orr of Orrick stress that having those data points allows the firm and its people to plan and adapt for the known chaos.

Both attorneys also note that while some tools purport to predict behavior using sophisticated content analysis of judicial decisions and opinions, there is room to improve when it comes to prediction. Nevertheless, Butler Curtis and Orr still view the tools as valuable—particularly for an unforeseen benefit: the analytics have become a tool in the firm’s commitment to well-being. When CaseStream delivers “analytics on a platter,” lawyers get data points such as court and even judge timelines (for example, the data shows that this court or judge typically takes X days to rule on this type of matter). How does that affect well-being? Lawyers gain a measure of predictability in an otherwise unpredictable practice area. Having this information allows the firm both to staff cases sensibly, including ensuring work is distributed equitably and efficiently, and to give individual lawyers more control over how they allocate their time—including with respect to work/life integration. This may be particularly important given that litigation has traditionally been male-dominated.

Butler Curtis recalls an instance where such legal analytics would have proved fruitful. A junior appellate lawyer joined the trial team just as they were starting trial. She relates:

I’ll never forget, he called me in a panic asking: “How are you going into the pretrial conference? We don’t have an answer on the jury charge.” “How are you doing opening arguments? We don’t have a ruling on the motion in limine.” Each step was a stressor for him. Imagine if he had access to analytics showing that in this court it was not uncommon for the judge to begin trial with pretrial motions still pending.

Again, Butler Curtis and Orr acknowledge that such analytics are imperfect. However, they stress that having those data points, particularly during stressful and often unpredictable litigation, allows the firm and its people to plan and adapt for the known chaos. As Orr says, they can help cut down on “exercises in futility,” helping lawyers ask more informed questions, such as, “Why are we filing this motion in limine if we know this court never rules on them?”

Both Butler Curtis and Orr admit that the idea that such analytics could help mitigate burnout is something they’re figuring out as they go. And, when Orrick first heard the sales pitch from their legal technology vendors, it was an advantage that wasn’t even on the horizon. Right now, judicial analytics adds the biggest value in providing that perspective on timing.

Choose your arbitrator

Similar organizations are trying to categorize and make sense of international arbitration. Unlike with judges, companies that engage in arbitration often have the benefit of choosing their arbitrator. ArbiLex, a company started by Isabel Yang in the Harvard Innovation Lab while she was a Harvard Law School SJD student, uses artificial intelligence and predictive analytics to drive data-driven decision making in the legal profession. ArbiLex not only helps legal buyers choose their arbitrator on the basis of factors like standard time to resolution but also quantifies risk factors for legal financers. As noted in a Forbes article about the founder, “The main benefit of ArbiLex’s product is mapping out risk factors associated with an arbitration case. The startup’s algorithms can help benchmark and quantify probabilities, which allows users to think more probabilistically when assessing settlement outcomes as opposed to maintaining a binary mindset.”

Catherine Rogers, who is both professor of law at Penn State Law and professor of ethics, regulation and the rule of law at Queen Mary University of London, offers a different approach to the same problem. Her nonprofit, Arbitrator Intelligence, “is an information aggregator that collects and analyzes information about arbitrators’ track records, which parties and counsel can use to better inform their arbitrator selection,” according to its website. Rather than use machine learning to rank and categorize arbitrators, Rogers asks users of arbitration to provide qualitative feedback about their experiences, while arbitrators can record video interviews to market their services. To date, the organization has information about over 800 arbitrators from over 125 countries in its system.

Debating the issues

Usefulness aside, judicial analytics does not come without controversy. In 2019, France banned its use, citing privacy concerns and the possibility that such scrutiny could put undue pressure on judicial decision making. Yet judges, Ambrogi believes, should not fear analytics, which can help lawyers better tailor their briefs and strategically honor a judge’s knowledge and experience. “If I’m a judge who has a lot of experience in trademark law, I don’t want some lawyer trying to write a long thesis on trademark law, trying to educate me about something that I probably know better than that lawyer,” he says.

There are larger debates at play. Butler Curtis points to “a fundamental ground-swelling resistance to the amount of data that’s out there about each of us and how it is used.” Particular to this is the idea that “people are making decisions based on this increased data when it doesn’t necessarily tell the fair and full picture,” she says. (For more, see “The Cost of Judgment.”)

“These tools are expensive right now—because of what is needed to create them, they’ll continue to be expensive for a while,” says Ambrogi.

Butler Curtis also worries about the data in legal analytics, but not simply because of privacy concerns or issues of quality and consistency. What’s accounted for does not paint the whole picture and could lead to an unfair distribution of credit: “While the tools are working to improve this, the primary source of data is who signed the pleading, not who wrote the motion or did the argument.”

For Ambrogi’s part, he notes that “these tools are expensive right now—because of what is needed to create them, they’ll continue to be expensive for a while.” He goes on, “And that does potentially create an uneven playing field, both for smaller-firm lawyers, who don’t have the budgets to be using these tools in the same way as their big-firm counterparts, and even more so for pro se litigants, putting them at an even greater disadvantage.”

At the same time, in their paper, “Judging by Numbers: How Will Judicial Analytics Impact the Justice System and Its Stakeholders?” University of Ottawa law professors Jena McGill and Amy Salyzyn predict that judicial analytics could become cheaper and more widely available over time. Such a possibility opens judges up to scrutiny in ways that might prove fruitful for weeding out bias or discrimination. McGill and Salyzyn see a future in which judicial analytics is open to not just law firms willing to pay for it but also public citizens hoping to analyze, understand, and hold accountable public servants. This, they write, would be in line with many countries, like Canada, where governments have pledged further transparency.

What is the problem that we can solve with this data? Butler Curtis asks.

Other questions abound: Are legal analytics—and this type of curated assessment of judges—a natural evolution? What does it say about our judicial system? How should we think about decision making in the courtroom in general? For now, Ambrogi says, “in terms of practical impact on lawyers’ practices today, judicial analytics and broader litigation analytics are incredibly potent.”

Butler Curtis approaches it by asking: What is the problem that we can solve with this data? “Just when we think of all the ways that you would use it, you think of three more,” she says.

