Opinion: A legal minefield awaits companies that want to use artificial intelligence


The increasing use of generative AI brings with it new questions about the use of these evolving technologies, particularly in the area of copyright.

Martin Meissner/The Associated Press

Arjun Gupta is a law student and President of the JD-MBA Students’ Association at the University of Ottawa.

In recent years, artificial intelligence (AI) has been making waves in various industries, from healthcare to art. The rise of generative AI tools — which use vast amounts of existing data such as audio, code, images, text, simulations, and video to produce original content — is no exception. One of those tools is ChatGPT, which former US Treasury Secretary Lawrence Summers called “the most important universal technology since the wheel.”

But with this growth come new questions surrounding the use of these technologies, particularly in the area of copyright.


When dealing with the legal risks of generative AI, Canadian companies must remember that this is an evolving area, and that the law has not yet caught up with the challenges posed by this rapidly developing technology.

This month, Getty Images filed a lawsuit against London-based Stability AI, alleging that it unlawfully copied and processed millions of images from Getty’s database to create artistic works for private gain.


This has raised alarm among Canadian companies investing in AI and seeking to clarify the boundaries of copyright law in the face of these emerging technologies.


While the Getty lawsuit primarily focuses on licensing issues, it is likely the first of many related to generative AI and its impact on copyright doctrine. Indeed, tools like ChatGPT are trained on massive pools of data, including the full text of Wikipedia, which makes it impractical for AI companies to pay for licences covering hundreds of millions of data sources.


The question then becomes whether the risk of violating copyright law is outweighed by the economic benefit that early adopters could reap by investing in and applying generative AI tools.

One possible way forward is for courts to extend the application of fair dealing in copyright to AI technologies.

Fair dealing currently allows the use of copyrighted works for research purposes, but not for commercial ones. However, if machine learning were captured under a broader definition of “fair dealing” that recognized the greater benefits to society of advancing AI technologies, it could provide a safe harbour for companies to continue investing in their development.

A clear example of such a benefit is the successful use of AI in skin cancer detection. An AI tool trained on a dataset of thousands of images recently matched the accuracy of a panel of experienced dermatologists in detecting skin cancer. This could have significant implications for earlier, lower-cost skin cancer detection, particularly for people living in remote or low-income areas. However, without clarity about the legality of the images used to train such AI models, a chilling effect could set in, leading to fewer advances of this kind.



Also relevant is the question of original content and the level of human involvement required for a work to be eligible for copyright protection.

In CCH Canadian Ltd. v. Law Society of Upper Canada, a 2004 Supreme Court of Canada case concerning the threshold of originality and the limits of fair dealing in copyright law, then-chief justice Beverley McLachlin established a flexible approach to originality, holding that a work must be the result of an exercise of “skill and judgment” to be considered original.

AI-generated content may fail this test. It is therefore up to the courts to clarify whether the human “skill and judgment” involved in programming a neural network before it is trained meets the requirements of the originality test.

This issue has already been addressed in the United Kingdom. Section 9(3) of the Copyright, Designs and Patents Act 1988 states: “In the case of a literary, dramatic, musical or artistic work which is computer-generated, the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken.”


Canadian courts would be well advised to follow suit. Without the copyright protection afforded to “original” works, Canadian companies should be aware that any new content generated by their AI tools could be exploited by competitors without legal recourse.

One possible solution for companies is to use their own datasets, or to verify that any external datasets used to train their AI were legally acquired, perhaps by having the third party sign a release form that specifically states the training data were obtained legally. This proactive approach could help companies reduce the risks associated with using generative AI and ensure they operate within legal boundaries. However, it offers no guarantee.

READ :  The size of the artificial intelligence in cybersecurity market is

While the future of AI is undoubtedly exciting, it is equally important for companies to consider the legal implications of using these technologies so they can continue to innovate while respecting the rights of others.

The rise of generative AI serves as a wake-up call for Canadian companies to stay informed, stay ahead of the curve and remain on the right side of the law. The benefits of AI are numerous, but it is important that Canadian companies closely watch important decisions and controversies, such as the Getty Images lawsuit, as they unfold. This will enable them to take the necessary steps to protect themselves and their interests.