As generative artificial intelligence (AI) becomes more advanced and consumes more content for training, the question of whether intellectual property is fair game comes into sharp focus.
The sectors most affected by generative AI's use of their content are the creative industries, including film, music, TV and news platforms.
Michael Park, a partner at global law firm Dentons, said intellectual property is a large revenue source for many of these industries.
“There are some interesting studies that show that organisations with intellectual property rights will generate, on average, 29% higher revenue per employee than organisations without their intellectual property rights, and also patents,” Park tells Azzet.
“Beyond copyright, organisations holding patents exhibit a premium of 26%, trademarks 29% and designs 31%, so these are the sorts of revenues that are at stake when it comes to AI copyright infringement.”
Across the world, responses to the advent of AI-generated content have ranged from lawsuits to updated legislation.
Past lawsuits and controversy
Last year, bodies representing thousands of creatives in the UK rejected the Labour government’s plan to create a copyright exemption to help AI companies train their algorithms.
The Creative Rights in AI Coalition (Crac) said copyright laws must be respected and enforced rather than degraded, and asked for the onus to be put on generative AI developers to seek permission, agree to licences and pay rights holders if they wish to train their algorithms on copyrighted content.
In response to the British Government’s debates on AI training, OpenAI urged lawmakers to let it use copyrighted works.
OpenAI is one of the leaders in generative AI technology and, as of October 2024, was valued at US$157 billion.

In evidence submitted to the House of Lords communications and digital committee, OpenAI said it would not be able to function and generate revenue using only public-domain content.
"Limiting training data to public domain books and drawings created more than a century ago might yield an interesting experiment, but would not provide AI systems that meet the needs of today's citizens,” the company wrote in the evidence filing.
The company also said it believed "legally, copyright law does not forbid training."
This came after OpenAI faced lawsuits from book publishers and The New York Times for allegedly using their content illegally to train its AI.
In a press release about the lawsuit, the Authors Guild said the median income for a full-time author in 2022 was barely over US$20,000, including book and author-related activities. It said that generative AI threatens to decimate the author profession.
“This case is merely the beginning of our battle to defend authors from theft by OpenAI and other generative AI,” said Maya Shanbhag, the Authors Guild president and a class representative.
The New York Times lawsuit, alleging that OpenAI and Microsoft used millions of its articles to train their AI, did not make a specific monetary demand. However, the newspaper said the defendants should be held responsible for “billions of dollars in statutory and actual damages”.
Record labels also filed copyright infringement lawsuits in the US against the AI music apps Suno and Udio, seeking damages of US$150,000 per work for thousands of tracks where copyright was allegedly infringed.
Legislation and the law
The first comprehensive AI regulation came in the form of the EU AI Act. The act addresses all risk levels and includes transparency requirements covering copyright and AI.
Under the act, generative AI systems must disclose that content was generated by AI, be designed to prevent the production of illegal content, and publish summaries of the copyrighted data used for training.
While Australia is yet to implement mandatory AI guardrails, there are currently 10 voluntary AI safety guardrails, one of which pertains to copyright and transparency.
Park says Australia is also looking to implement mandatory guardrails, similar to the EU AI Act and compliant with the Australian Copyright Act.
“The Copyright Act in Australia is going to apply to content that might be developed or might be used in AI solutions.”
Christina Platz, a senior lecturer in business and law at RMIT University, said that while there have not yet been any local cases of AI copyright infringement, she could see the Copyright Act being adapted as they arise.

She says this could occur through case law, where judges make law through decisions that apply to future cases of a similar nature.
“I do think that like when there is a case that's going to come up in Australia, then case law will respond to this challenge,” she says.
“I guess we'll see if it's then necessary to make any amendments to the Act."
Platz says there is a debate among academics about how licensing could be used.
“Some academics are looking at whether we should have a collective licensing system in Australia that targets machine learning, but we just don't have that right now,” she says.
To help businesses embrace the new technology, Platz says countries will need to make their stances and legislation clearer.
“Thinking about commerce internationally, well, we might be at a disadvantage if our businesses aren't embracing this new tool,” she says.
As for the AI companies themselves, Park says it is becoming increasingly difficult for them to keep their content compliant because of differing rules across jurisdictions.
“It is becoming increasingly difficult for AI developers to stay compliant with the law,” he says.
“I think we’ll see over the next five years a bit more alignment globally when it comes to regulation of AI, which will make life easier for AI developers.”