Predictive justice: failsafe or false promise?
Justice in trial
The delivery of justice in France ‒ or the perception of it, depending on the case ‒ is rarely inspiring. The European Commission for the Efficiency of Justice’s (CEPEJ) 2018 report on the efficiency and quality of justice in France is not particularly flattering, be it in terms of the resources available to the courts, the slowness with which rulings are handed down by the courts of first instance (potentially due to the aforementioned lack of resources), the online accessibility of information for litigants, or the court rulings themselves. The Odoxa survey conducted in May 2019 at the request of the National Bar Council is no more reassuring: it shows, among other things, that seven in ten French citizens find it increasingly difficult to access the legal system.
What’s more, a survey carried out in June 2019 by the Institut français d'opinion publique (Ifop) on behalf of French legal tech company Doctrine shows that the majority of lawyers want a legal system that functions better and is more transparent.
Many people rightly insist that we should have full access to rulings handed down by the French courts. This view is underpinned by Articles 20 and 21 of the French Digital Republic Act of 7 October 2016, which state that court decisions must be made available to the public free of charge. Transparency, however ‒ sometimes elevated to the status of dogma ‒ only makes sense if it is of social and/or economic benefit. Is this really the case here, when the ‘public’ ‒ in the broadest sense of the term ‒ does not necessarily have the ability to decipher court rulings? Still, such transparency is useful to the legal professions and to certain legal tech companies: court rulings are their bread and butter. To conduct proper ‘analysis’, they need enough data. And data is a prerequisite to any form of ‘predictive justice’, if this is, indeed, the correct term.
Economic actors aside ‒ they may base their strategies or tactical decisions on legal uncertainties, ambiguities or ‘grey areas’ that create opportunities for them ‒ legal certainty is an important principle in any legal system, as it protects citizens from the negative side-effects of the law (for example, complex or changing norms). The judiciary has a considerable role to play here. As María Isabel Garrido Gómez, lecturer in Philosophy of Law at the University of Alcalá, Spain, puts it:
"The central elements of legal certainty as a basis for the predictability of judicial decisions are legal certainty and efficiency, as well as the absence of arbitrariness."
Court rulings must, therefore, display a degree of ‘predictability’. When courts (or sometimes even the same court) deliver contradictory or wildly divergent rulings on similar or identical facts and points of law, legal certainty is lost, along with the trust of litigants and their counsels.
The term "predictive" justice can lead to misunderstandings.
Is predictive justice synonymous with legal certainty, then? To us, as to Magistrate Emmanuel Poinas, author of Le tribunal des algorithmes: juger à l'ère des nouvelles technologies**, ‘forecast’ is preferable to ‘prediction’.
While the Larousse dictionary of the French language considers the words ‘prediction’ and ‘forecast’ to be synonymous, it accords a meaning to ‘prediction’ that ‘forecast’ does not have – ‘divination’, ‘oracle’, ‘prophecy’. Referring to ‘predictive’ justice may make sense from a marketing point of view, but it adds nothing to the usefulness of analytical tools in matters of jurisprudence. Predictive justice is not a fortune-telling machine.
If the quantity of data (in this case, court decisions) is sufficient, then data analytics, with its computational capacity, presents us with an opportunity to identify court trends based on a number of parameters. Companies (or their advisors) ‒ because they are primarily the ones who will have access to and use such tools ‒ will thus be able to determine their level of legal or judicial risk, make an informed decision (to file a plea, seek an alternative method of conflict resolution, proceed with or withdraw a legal action) and make any necessary adjustments, in consultation with their counsels.
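To make this concrete, the kind of trend analysis described above can be sketched in a few lines of code. The sample rulings, court names, claim types and figures below are entirely invented for illustration; a real tool would work on a large corpus of published decisions and far richer parameters.

```python
from collections import defaultdict

# Hypothetical sample of past rulings: (court, claim type, outcome, award in EUR).
# All names and figures here are invented for illustration only.
rulings = [
    ("Paris", "unfair dismissal", "upheld", 12000),
    ("Paris", "unfair dismissal", "dismissed", 0),
    ("Lyon", "unfair dismissal", "upheld", 9000),
    ("Paris", "unfair dismissal", "upheld", 15000),
    ("Lyon", "unfair dismissal", "dismissed", 0),
    ("Lyon", "unfair dismissal", "upheld", 11000),
]

def court_trends(data):
    """Group rulings by court and report the rate of upheld claims
    and the average award among upheld claims."""
    stats = defaultdict(lambda: {"total": 0, "upheld": 0, "awards": []})
    for court, _claim, outcome, award in data:
        s = stats[court]
        s["total"] += 1
        if outcome == "upheld":
            s["upheld"] += 1
            s["awards"].append(award)
    return {
        court: {
            "upheld_rate": s["upheld"] / s["total"],
            "avg_award": sum(s["awards"]) / len(s["awards"]) if s["awards"] else 0.0,
        }
        for court, s in stats.items()
    }

trends = court_trends(rulings)
# e.g. trends["Paris"] gives the share of upheld claims and the average award
```

Even this toy aggregation shows the point: the output is a statistical tendency a litigant and counsel can weigh when assessing risk, not a prophecy about any individual case.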
By analysing the past, such algorithms “augment the present”, to quote French magistrate Antoine Garapon, without really predicting the future. As we will see, they may even narrow down future options. ‘Augmented justice’ (we far prefer this term) is also useful for judges.
France’s Court of Cassation is currently launching a project that uses artificial intelligence to identify jurisprudential divergence in its own decisions and in those of the Courts of Appeal. For the legislative and regulatory powers, it is incontrovertibly useful to be able to determine whether the legislation that judges must enforce meets its objectives of incentivisation or deterrence. Presented in this way, the idea of ‘augmented justice’ is attractive. However, we must also consider the possibility that justice will be badly augmented (and thus diminished).
Ensuring that justice remains undiminished and unmanipulated
As mentioned above, one must first have a sufficiently sizeable, if not fully comprehensive, database of case law. However, Article 33 of the French Judicial Planning Act of 23 March 2019 has put limits on the completeness of the data that can be processed, insofar as the names of judges can be kept confidential to protect their privacy or for security reasons. Judges’ identity data cannot under any circumstances be used "for the purpose or effect of evaluating, analysing, comparing or predicting their actual or supposed professional practices". The terms ‘analysing’ and ‘comparing’ may pose something of a problem here, as users of the justice system are legitimately allowed to identify the criteria on which a court bases its ruling one way or another (including the composition of the court itself).
Conversely, attaching performance goals to augmented justice (for judges) can lead to the diminishment of justice. The use of data analytics should not pressure a judge into conventional thinking in order to be well rated, creating a culture of conformity and inertia. Non-conformist views should, however, still be backed up by solid argument. Consistency is not conformity: what is ultimately sought is transparency of criteria and, hence, legal certainty.
To augment justice (and justices) using algorithms or multi-agent accelerators for data science (MAADs), the algorithms must be ethically unassailable and robust. Their neutrality and transparency, as well as their evaluation mechanisms, must be guaranteed. This raises the question, however, of who should provide the guarantee and how: the state, a third-party certifier, or the invisible hand of the market?
Once we move past the idea that it is somehow divinatory in character, augmented justice is seductive because, under certain conditions, it is genuinely useful. By shedding light on the present through a better understanding of the past, the legal system and those who work in it can help create a less worrisome future.
** Available in French and published by Éditions Berger Levrault, the title of the book translates as The court of algorithms: judging in the era of new technology.
Photo: Unsplash/Franck V.