Judge won’t punish Michael Cohen for relying on artificial intelligence.

A Manhattan judge on Wednesday refused to impose sanctions on former President Donald J. Trump’s onetime fixer, Michael D. Cohen, after Mr. Cohen mistakenly sent his lawyer fake legal citations created by the artificial intelligence program Google Bard, which ended up in a motion being prepared on Mr. Cohen’s behalf.

His attorney, David M. Schwartz, cited the bogus cases in a motion filed in federal district court.

Judge Jesse M. Furman said the incident was embarrassing and sad, but he accepted Mr. Cohen’s explanation that he did not understand how Google Bard worked and that he did not mean to mislead Mr. Schwartz. The judge also said he did not find that Mr. Schwartz acted in bad faith.

“Indeed, it would have been grossly unreasonable for him to have provided the bogus cases for Schwartz to add to the motion, knowing they were bogus,” Judge Furman wrote of Mr. Cohen, a former attorney, noting that it was likely that Mr. Schwartz, the government or the court would have discovered the problem, “with potentially serious adverse consequences for Cohen himself.”

The issue arose from Mr. Cohen’s 2018 guilty plea to tax evasion and campaign finance violations connected to Mr. Trump, for which he served time in prison. He was seeking early termination of the court’s supervision of his case, having been released from prison and having complied with the conditions of his release.

Judge Furman had denied Mr. Cohen’s three previous requests. In the latest request, his lawyer, Mr. Schwartz, pointed out that his client testified for two days last fall in Mr. Trump’s New York state civil fraud trial. Mr. Cohen’s “willingness to come forward and provide truthful accounts,” Mr. Schwartz argued, “demonstrates an extraordinary level of remorse and commitment to upholding the law.”

But Judge Furman said Mr. Cohen’s testimony in the state case “actually provides a reason to deny his motion, not to grant it.” In that testimony, the judge noted, Mr. Cohen admitted he had lied in federal court when he pleaded guilty to tax evasion, a crime he now says he did not commit.

A lawyer for Mr. Cohen did not immediately respond to a request for comment on Judge Furman’s decision.

Mr. Cohen’s reputation will be at the center of Mr. Trump’s first criminal trial, which is scheduled to begin in mid-April in Manhattan. Mr. Cohen, one of the prosecution’s star witnesses, was involved in the hush-money deal at the heart of the case, which was brought by the Manhattan district attorney’s office. Mr. Trump’s lawyers may seize on Mr. Cohen’s contradictory statements in the civil fraud case, and possibly on Judge Furman’s ruling, to attack his credibility. But the district attorney’s office will counter that Mr. Cohen told many of his earlier lies on Mr. Trump’s behalf, and that he has told a consistent story about the hush-money deal over the years.

Judge Arthur F. Engoron, who oversaw the civil fraud trial, said he found Mr. Cohen’s testimony “credible” and imposed a crushing $454 million judgment against Mr. Trump.

In his request to end judicial supervision of his case, Mr. Cohen had been trying to assist his lawyer, Mr. Schwartz.

Mr. Cohen said in a sworn declaration in December that he had not kept up with emerging trends (and associated risks) in legal technology and had not realized that Google Bard was a generative text service that, like ChatGPT, could produce citations and descriptions that looked real but really were not.

Mr. Cohen also said that he had not realized Mr. Schwartz would drop the cases into his submission wholesale without verifying that they existed.

Mr. Cohen asked Judge Furman to use “discretion and mercy.”

The case is one of several to come to light in Manhattan federal court over the past year in which the use of artificial intelligence has tripped up the legal profession. Nationally, there have been at least 15 cases in which lawyers or litigants representing themselves are believed to have used chatbots for legal research that ended up in court filings, according to UCLA law professor Eugene Volokh.

The issue first drew wide attention last year when Judge P. Kevin Castel, also of Manhattan federal court, fined two lawyers $5,000 after they cited nonexistent rulings produced by ChatGPT and submitted a brief full of the fake legal references.

A series of similar cases played out in federal courts in Manhattan.

In one, a lawyer admitted citing a nonexistent case, Bourguignon v. Coordinated Behavioral Health Services Inc., that he said had been “suggested by ChatGPT” after his own research failed to support the argument he was making. In January, the U.S. Court of Appeals for the Second Circuit referred him to a judicial panel that investigates complaints against lawyers.

And in another case, Federal District Court Judge Paul A. Engelmayer publicly admonished a law firm in Auburn, New York, that acknowledged it had used ChatGPT to support its application for attorney’s fees in a lawsuit against the New York City Department of Education.

Judge Engelmayer said the firm’s “invocation of ChatGPT as support for its aggressive fee bid is wholly and extraordinarily unpersuasive.”

The cases highlight challenges for the legal profession as lawyers increasingly rely on chatbots to prepare legal briefs. Artificial intelligence programs, such as ChatGPT and Bard (now known as Gemini), generate realistic responses by predicting which pieces of text should follow other sequences.
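As a rough illustration of that prediction idea, here is a minimal sketch in Python (the toy corpus, the `following` table and the `next_word` function are invented purely for illustration; this is not how ChatGPT or Bard is actually built):

```python
import random
from collections import Counter, defaultdict

# Toy "training" text, invented purely for illustration.
corpus = "the court cited the case and the court denied the motion".split()

# Count which word has followed which: a crude stand-in for a language model.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word(word):
    """Sample the next word in proportion to how often it followed `word` in the corpus."""
    counts = following.get(word)
    if not counts:
        return None
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation starting from "the".
word, output = "the", ["the"]
for _ in range(6):
    word = next_word(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))
```

The output can read fluently even though nothing in the process checks it against real sources, which is how a chatbot can produce citations that look plausible but do not exist.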

Mr. Cohen wrote in his declaration that he had considered Bard to be a “supercharged search engine” that in the past had yielded accurate information. The cases he passed along to Mr. Schwartz were apparently “hallucinations,” a term for chatbot-generated errors.

The incident came to light in December when Judge Furman said in an order that he could not find any of the three decisions that Mr. Schwartz had cited in his motion. He ordered Mr. Schwartz to provide him with copies of the rulings or “fully explain how this motion came to cite cases that do not exist and what role Mr. Cohen played.”

Mr. Schwartz said in his declaration that he did not independently review the cases provided by Mr. Cohen because Mr. Cohen had indicated that another attorney was providing him with suggestions for the motion.

“I apologize to the court for not personally examining these cases before presenting them to the court,” Mr. Schwartz wrote.

Barry Cummins, Mr. Schwartz’s lawyer, said Wednesday, “We are pleased that the court saw this error as one that Mr. Schwartz did not make maliciously.”

Ben Protess contributed reporting.
