
The AI Lawyer: A New Practice Area and a Fertile Ground for Future Litigation | Epstein Becker & Green


Searching the internet for “AI and litigation” returns countless results about how AI will either replace lawyers or transform the legal profession. These results are unsurprising. Since the early 2010s, articles examining the potential impact AI may have on lawyers have appeared every few months. But these results miss the bigger and more important picture: AI will likely spawn a myriad of litigation stemming from its use. That litigation will give rise to AI lawyers who specialize in the technology’s complexities.

In fact, the rise of the AI lawyer has already begun, with a handful of AI-related lawsuits filed this year. Below is a summary of those proceedings and where they stand.

AI Lawyer & Employment

It is foreseeable that one of the earliest areas AI lawyers will tackle is employment law. For years, companies have been implementing AI tools to help screen applicants, monitor employee productivity, and support learning and development. And in February 2023, one of the earliest cases focusing on AI and employment law was filed.

In February 2023, Derek L. Mobley filed a class action lawsuit against Workday, Inc., a company that offers an AI system and job-screening tool that allows its clients to recruit, hire, and onboard employees. Mr. Mobley, the class representative, has a Bachelor’s degree in finance and an Associate’s degree in Network Systems Administration. He is also an African American over the age of forty with a disability. Since 2018, Mr. Mobley has applied to at least eighty to one hundred positions with companies allegedly using Workday’s AI, all of which denied him employment. Mr. Mobley alleges that Workday’s AI tool engages in a systemic and continuous pattern of discrimination against African Americans, individuals with disabilities, and individuals over the age of forty. Workday has not yet filed an answer or a motion to dismiss.

AI Lawyer & Copyright

In 2023, AI image generators became the focus of intellectual property litigation.

On January 13, 2023, three artists, Sarah Andersen, Kelly McKernan, and Karla Ortiz, initiated a class action lawsuit against Stability AI Ltd.; Stability AI, Inc.; Midjourney, Inc.; and DeviantArt, Inc., alleging copyright infringement, violation of the Digital Millennium Copyright Act, and various other statutory and common law violations under California law. The artists allege that Stability AI Ltd.’s program Stable Diffusion, which creates images from text prompts, was improperly trained on billions of copyrighted images. These images, which include the artists’ own copyrighted work, are allegedly stored and incorporated into the program as compressed copies, without the artists’ consent or any payment to them. The artists are currently seeking class certification. The defendants have moved to dismiss the complaint, and a decision on that motion remains pending.

AI Lawyer & Defamation

AI chatbots like ChatGPT and Bard are also exposing their creators to potential litigation through AI systems’ “hallucinations.” A hallucination is the inclusion of factually incorrect, nonsensical, or irrelevant information in a response to a question posed by a human. In early June, a case was filed in Georgia centering on an AI chatbot hallucination that allegedly defamed a person.

On June 5, 2023, Mark Walters, a radio host from Georgia, sued OpenAI LLC, the creator of ChatGPT. Mr. Walters alleged that in May 2023, a journalist named Fred Riehl interacted with ChatGPT regarding a real case, The Second Amendment Foundation v. Robert Ferguson. Mr. Riehl asked ChatGPT to summarize the accusations in the complaint. ChatGPT responded with a short paragraph stating that Mr. Walters was a defendant in the lawsuit and had been accused of embezzling funds from and defrauding The Second Amendment Foundation. After receiving this summary, Mr. Riehl asked ChatGPT to provide the portion of the complaint containing those allegations against Mr. Walters and, eventually, the full complaint. ChatGPT complied. The problem was that ChatGPT had “hallucinated” all of the claims against Mr. Walters: in reality, he had never worked for The Second Amendment Foundation, nor had he been accused of embezzlement or fraud. ChatGPT had also “hallucinated” the complaint itself, which did not resemble the complaint actually filed in the real case. Mr. Walters alleges that ChatGPT’s communication with Mr. Riehl constituted a published libelous statement about him, and that OpenAI LLC, as the owner of ChatGPT, was negligent in that it knew or should have known the communication was false, or recklessly disregarded its falsity. No answer or motion to dismiss has been filed in this case yet.

These cases are among the first examples of complex litigation arising from the increasingly prevalent use of AI. But future litigation will not be limited to these areas. AI litigation is expected to spill into other areas of law including, but not limited to, personal injury as autonomous vehicle accidents increase, health care as AI is used more often to review medical records and evaluate claims, and trade secrets as employees feed proprietary business information into AI tools. Lawyers should be forearmed for this unique practice area and the opportunities that may lie ahead in order to best serve their clients in this burgeoning legal landscape.

