How copyright law could threaten the AI industry in 2024

The year 2023 marked a significant milestone for Artificial Intelligence (AI) as its impact on the world became increasingly noticeable. In 2024, however, AI could reshape United States copyright law, as companies such as Microsoft-backed OpenAI, Meta Platforms, and Midjourney face lawsuits from copyright holders. These lawsuits claim that the companies utilised the plaintiffs’ work to train their AI systems without permission or compensation.

So far, judges have expressed scepticism towards some of the plaintiffs’ arguments, noting, for instance, that AI-generated content is not created by humans. The more consequential question, however, of whether AI companies infringe on a massive scale by training their systems on vast amounts of material scraped from the internet, including images and writings, has not yet been addressed.

The plaintiffs are seeking court orders to block the misuse of their work, as well as monetary damages. The technology companies counter that their AI training process is similar to how humans learn new concepts, and that their use of the material qualifies as “fair use” under copyright law.

The outcome of these lawsuits could have significant implications for the AI industry. If the courts rule against the companies, it could create significant roadblocks for the industry’s growth, and the companies may need to pay for the copyrighted materials used to train their AI models.

The ongoing legal dispute between Thomson Reuters and Ross Intelligence could serve as a precedent for future cases involving AI copyright issues. Thomson Reuters accuses Ross Intelligence of unlawfully copying thousands of its “headnotes” to train an AI-based legal search engine. The case could mark a turning point for AI copyright litigation, with a jury potentially hearing it as early as next August.

Source: Reuters
