Last week’s Senate Inquiry into the risks and opportunities of AI adoption showed positive steps are being taken towards the responsible use of the technology, albeit outside the country’s creative industries, according to Australian Writers’ Guild executive director Claire Pullen.
Pullen was among the stakeholders invited to give testimony before the Select Committee on Adopting Artificial Intelligence (AI) in Canberra last week, joining representatives from the Media, Entertainment & Arts Alliance, the Australian Association of Voice Actors, and the Law Council of Australia, as well as tech companies Adobe and Atlassian.
There were also contributions from AI-based businesses, such as software engineering practice Nuvento and Xaani AI, which works across Australian Government and Defence agencies, and large enterprise organisations.
Within their respective submissions, and subsequent testimony on Wednesday, both companies highlighted the importance of a robust regulatory framework, with Nuvento noting there should be ethical guidelines “promoting transparency and accountability, and ensuring that AI systems are designed and developed with human values in mind”.
The Select Committee, established in March, will consider 236 submissions on what the uptake of AI could mean across various sectors before reporting to Parliament in September.
Pullen, who appeared before the committee on Tuesday, said the caution being shown by companies at the forefront of technology in Australia stood at odds with the approach of some of the larger AI companies toward creative work.
“[Australian AI companies] were able to give concrete examples of where they are already working and already turning a profit, whereas the reporting from the US indicates these big ‘free’ open AIs are losing money at a rate of knots, and there is no business or use case for them,” she said.
“They’ve got all the technology that exists in their models and they still aren’t very good – no one wants to pay for them.
“There’s going to be cases where AI and Large Language Models can do good things and have really specific uses but we are a long way from that in the creative industries.”
In a joint submission with the Authorship Collecting Society, Australian Screen Editors Guild, Australian Production Design Guild, and Australian Cinematographers Society, the AWG emphasised the “urgent” need for AI regulation.
Among the chief concerns for the guilds are the unauthorised and unremunerated inputs to, and outputs of, generative AI, including large language models (LLMs), along with the use of automated decision-making (ADM) in games and interactive projects, and generative adversarial networks (GANs) for image-based designers.
The guilds go on to recommend restricting AI within the creative industries via legislation requiring that artists expressly opt in to having their work used by generative AI platforms, and that they be paid if it is used to generate output, with further royalties due each time audio-visual output is accessed or transmitted.
It comes as media companies grapple with the reality of AI and its rapidly growing capabilities.
While The New York Times launched a lawsuit against OpenAI and Microsoft for copyright infringement at the end of last year, claiming the two companies used its articles to train their generative AI and large language model systems, other businesses such as News Corp have since inked content deals with OpenAI, granting access to current and archived content.
Pullen said the examples served to highlight the pressing nature of the issue.
“I think the problem is the AI companies just did the infringement, took all the work, and clearly decided to take the chance that no one would complain about it,” she said.
“It’s not either/or [between litigation and making a deal] but I think one of the only reasons why these companies are negotiating deals is because they got caught out and someone did decide to sue them.
“It’s the only logical response given there was such a trial about taking the copyrighted works without consent or licensing.”
She added there was a need to address the disparity between the length of the regulatory process and AI’s growth rate, putting forward the idea of a litigation fund for copyright issues.
“One of the problems with AI is not that we don’t know the work has been infringed but it’s very difficult to litigate,” she said.
“If we are able to establish an Attorney-General’s copyright litigation fund, we’d actually be able to go and answer some of these questions from the government without waiting for parliament.”