Whoops, another professional gets called out for inappropriate use of ChatGPT. This one is particularly ironic in that the professional in question is a ‘renowned expert on misinformation’.

He was giving $600-an-hour expert testimony in a court case involving a law regarding the ‘Use of Deep Fake Technology to Influence an Election’. Legal testimony, of course, often involves citing documents. Which this guy did. Unfortunately, his citations were nothing but made-up hallucinations, courtesy of GPT-4o.

From the Stanford Daily:

The error occurred when he asked GPT-4o to write a short paragraph based on bullet points he had written. According to Hancock, he included “[cite]” as a placeholder to remind himself to add the correct citations. But when he fed the writing into GPT-4o, the AI model generated manufactured citations at each placeholder instead.

This is, of course, far from the first time something like this has happened in court. Check your work, folks!