Man Employs A.I. Avatar in Legal Appeal, and Judge Isn’t Amused

Jerome Dewald sat with his legs crossed and his hands folded in his lap in front of an appellate panel of New York State judges, ready to argue for a reversal of a lower court’s decision in his dispute with a former employer.

The court had allowed Mr. Dewald, who is not a lawyer and was representing himself, to accompany his argument with a prerecorded video presentation.

As the video began to play, it showed a man seemingly younger than Mr. Dewald’s 74 years wearing a blue collared shirt and a beige sweater and standing in front of what appeared to be a blurred virtual background.

A few seconds into the video, one of the judges, confused by the image on the screen, asked Mr. Dewald if the man was his lawyer.

“I generated that,” Mr. Dewald responded. “That is not a real person.”

The judge, Justice Sallie Manzanet-Daniels of the Appellate Division’s First Judicial Department, paused for a moment. It was clear she was displeased with his answer.

“It would have been nice to know that when you made your application,” she snapped at him.

“I don’t appreciate being misled,” she added before yelling for someone to turn off the video.

What Mr. Dewald failed to disclose was that he had created the digital avatar using artificial intelligence software, the latest example of A.I. creeping into the U.S. legal system in potentially troubling ways.

The hearing at which Mr. Dewald made his presentation, on March 26, was filmed by court system cameras and reported earlier by The Associated Press.

Reached on Friday, Mr. Dewald, the plaintiff in the case, said he had been overwhelmed by embarrassment at the hearing. He said he had sent the judges a letter of apology shortly afterward, expressing his deep regret and acknowledging that his actions had “inadvertently misled” the court.

He said he had resorted to using the software after stumbling over his words in previous legal proceedings. Using A.I. for the presentation, he thought, might ease the pressure he felt in the courtroom.

He said he had planned to make a digital version of himself but had encountered “technical difficulties” in doing so, which prompted him to create a fake person for the recording instead.

“My intent was never to deceive but rather to present my arguments in the most efficient manner possible,” he said in his letter to the judges. “However, I recognize that proper disclosure and transparency must always take precedence.”

A self-described entrepreneur, Mr. Dewald was appealing an earlier ruling in a contract dispute with a former employer. He eventually presented an oral argument at the appellate hearing, stammering and taking frequent pauses to regroup and read prepared remarks from his cellphone.

As embarrassed as he might be, Mr. Dewald could take some comfort in the fact that actual lawyers have gotten into trouble for using A.I. in court.

In 2023, a New York lawyer faced severe repercussions after he used ChatGPT to create a legal brief riddled with fake judicial opinions and legal citations. The case showcased the perils of relying on artificial intelligence and reverberated throughout the legal profession.

The same year, Michael Cohen, a former lawyer and fixer for President Trump, provided his lawyer with phony legal citations he had gotten from Google Bard, an artificial intelligence program. Mr. Cohen ultimately pleaded for mercy from the federal judge presiding over his case, emphasizing that he had not known the generative text service could provide false information.

Some experts say that artificial intelligence and large language models can be helpful to people who have legal matters to deal with but cannot afford lawyers. Still, the technology carries real risks.

“They can still hallucinate — produce very compelling looking information” that is actually “either fake or nonsensical,” said Daniel Shin, the assistant director of research at the Center for Legal and Court Technology at the William & Mary Law School. “That risk has to be addressed.”
