Hallucination is the last thing you need

Shawn Curran

Co-written by Jylo CEO Shawn Curran and CTO Sam Lansley 

Abstract
The legal profession demands a multidimensional approach: synthesising an in-depth comprehension of a legal issue with insightful commentary drawn from personal experience, combined with a comprehensive understanding of the pertinent legislation, regulation, and case law, in order to deliver an informed legal solution. Current generative AI faces major obstacles in replicating this, as existing models struggle to integrate and navigate such a complex interplay of understanding, experience, and fact-checking procedures. Notably, when a generative model outputs understanding and experience, which reflect the aggregate of various subjective views on similar topics, this often deflects the model's attention away from the crucial legal facts, resulting in hallucination. This paper therefore explores the feasibility of three independent LLMs, each focused on understanding, experience, and facts respectively, synthesising into a single ensemble model to counteract the challenges posed by existing monolithic generative AI models. We introduce the idea of multi-length tokenisation to protect key information assets such as common law judgements, and finally we interrogate the most advanced publicly available models for legal hallucination, with some interesting results.
