Tech Explained: Australia backs AI licensing under copyright rules. Here’s a simplified explanation of the latest update and what it means for users.
Australia’s creative and media industries gathered at Parliament House to argue that AI training should be licensed under existing copyright rules. The event brought together ministers, parliamentarians, officials and industry executives.
The event, Powering Intelligence: Media, Culture and the Future of Innovation, was backed by organisations across music, publishing, screen, television and news media. Attorney-General Michelle Rowland and Senator Sarah Henderson delivered opening remarks, followed by a panel of industry and academic speakers on the policy choices facing Australia on artificial intelligence and copyright.
The debate comes as governments and rights holders scrutinise how AI developers use copyrighted material to train models. In 2025, Australia decided not to create a broad text and data mining exception, leaving commercial AI companies to seek permission to use protected works.
That decision was central to supporters of licensing at the Canberra event. Rowland said the government did not intend to dilute copyright law in response to pressure from AI developers.
“The Government has been clear for some time that there are no plans to weaken copyright protections when it comes to AI. This includes explicitly ruling out a Text and Data Mining exception, which I was extremely proud to announce last year,” said Michelle Rowland, Attorney-General, Australian Government.
Industry speakers argued that paid agreements between AI companies and content owners already provide a commercial path forward. They said licensing is being used across journalism, music, publishing and visual media, and pointed to a growing list of deals in Australia and overseas.
Examples included agreements involving Google and AAP, OpenAI with The Guardian and News Corp, Merlin with Udio, and Canva with Getty Images. Major record companies including Warner Music Group, Universal Music Group and Sony Music have also entered arrangements with AI platforms.
Market Model
Jonathan Dworkin, executive vice president of digital business development at Universal Music Group, said the music industry sees AI licensing as part of a broader effort to build legal products that can compete with unauthorised use.
“We didn’t defeat piracy by turning off the internet. Ultimately, we prevailed because streamers built a better product than piracy. That’s what we hope to do with AI,” Dworkin said.
Rebecca Costello, managing director of The Guardian Australia and New Zealand, linked the issue to the economics of journalism. She said unlicensed use of reporting by AI companies risked undermining news production itself.
“We invest everything in journalism. When that work is taken and used without compensation, the impact is fewer journalists, fewer newsrooms and less public interest journalism. No market operates when you can take something for free and then charge for it. Licensing is happening and it has to, because the alternative is the erosion of the journalism that feeds these models in the first place,” Costello said.
According to Creative Australia, the creative sector contributes AUD 67 billion to the Australian economy, making the stakes broader than a narrow copyright dispute. Supporters of licensing said the issue goes to the value of locally made cultural and media output, and to whether creators and publishers are paid when their work becomes part of AI products.
Trust and Protection
Charlie Chan, a composer, pianist and creative technologist, argued that the technical means already exist to identify and license works properly. He also said the policy debate should reflect the need to protect distinctive Australian cultural material, including First Nations content.
“We actually have the technology we need to license things properly, to protect content and find where everything is. We cannot fall into a homogenised experience of all of our creativity. Australia has something truly unique – a thousand generations of First Nations culture – and we have a responsibility to protect it,” Chan said.
Edward Santow, industry professor and director of policy and governance at the Human Technology Institute, UTS, said public confidence would depend on whether government set clear market rules and safeguards around AI.
“We need government to play the role of government: to ensure there is a fair market so that organisations can participate fairly, and to protect the population. Australians have among the lowest levels of trust in the world when it comes to AI, not because we are scared of technology – the opposite, we tend to be among the earliest adopters – but because people can see when technology goes wrong,” Santow said.
The discussion in Canberra also reflected broader international movement on AI and copyright. Supporters of stronger licensing rules noted that the UK recently stepped back from a broader text and data mining exception, which they said showed governments remain cautious about weakening creators’ rights as AI systems expand.
