Using ChatGPT “hallucinations” in Family Law Applications
As the use of Artificial Intelligence (“AI”) increases, there is much debate in the legal field as to how it can and should be used.
In a recent case, Zhang v. Chen, 2024 BCSC 285, Ms. Chong Ke, the lawyer for the Applicant Father, submitted a family law application that relied on “hallucinated” cases. Hallucinations in AI are outputs that are inaccurate or refer to things that simply do not exist. In this case, Ms. Ke submitted cases that did not exist and had been generated by ChatGPT. Ms. Ke was advocating for her client’s right to take his children to China. Her client, the Applicant Father, Wei Chen, lives and works in Shanghai. The children are Chinese nationals but reside in West Vancouver with the Respondent Mother. The couple’s divorce was settled in China in 2018.
On December 6, 2023, Ms. Ke included two case summaries in the materials filed in support of the Applicant Father’s parenting application. When the Respondent Mother’s lawyers contacted Ms. Ke’s team for further information on the cases, it was discovered that they were hallucinations and did not exist.
Although Justice David Masuhara did not believe that Ms. Ke intended to deceive the court, she was ordered to personally compensate opposing counsel for the time they spent clarifying the error. Ms. Ke formally apologized in a letter and signed an affidavit indicating that she had not been aware of the risks that ChatGPT posed. Considering the rules and guidance prescribed by the Law Society of British Columbia, Justice Masuhara concluded that it would be prudent for Ms. Ke to advise the court when her materials include AI-generated content. Justice Masuhara concluded by stating that generative AI is no substitute for the professional expertise of lawyers. This case serves as a reminder that lawyers have an ethical obligation to ensure that the materials they put before the court are accurate.
Want to know more? Please contact us at 416 927 0891 or at hello@dicksonappell.com