Evil AI in Court Proceedings

A Minnesota court threw out an expert declaration because it was generated with AI and included fake references. The case involved Minnesota Stat. § 609.771, which bans the use of deepfake media to influence elections.

Jeff Hancock, a misinformation specialist and Stanford University communication professor, used fake article citations generated by AI to support the state's arguments. Hancock subsequently admitted that his declaration inadvertently included citations to two non-existent academic articles and incorrectly cited the authors of a third article. In his defense, Hancock told Judge Provinzino that he used ChatGPT-4 while drafting his declaration and could not explain precisely how these AI-hallucinated citations got into it.

As Judge Provinzino said:

“The irony. Professor Hancock, a credentialed expert on the dangers of AI and misinformation, has fallen victim to the siren call of relying too heavily on AI—in a case that revolves around the dangers of AI, no less.”

Blaming it on the AI may be this decade's excuse, replacing "the dog ate my homework."

Author: Doug Cornelius
