The UK is working on rules for training AI models on creative works

UK ministers are drawing up plans to make tech companies more transparent about how they train their artificial intelligence models, after creative industry representatives raised concerns about works being copied and used without permission or payment.

Culture Secretary Lucy Frazer told the Financial Times that the government would set out its first attempt to create rules for the use of material such as TV programmes, books and music by artificial intelligence groups.

Frazer said ministers would initially focus on providing more transparency about what content AI developers used to train their models, in effect allowing the industry to see whether the work it produces is being stolen.

Rishi Sunak’s government is torn between the competing goals of strengthening the UK’s position as a global AI hub and protecting the country’s world-leading creative industries. A general election expected this year, with Sunak’s Conservatives trailing in opinion polls, is also likely to limit the time ministers and officials have to act.

Frazer said in an interview that she recognised that artificial intelligence was a “huge challenge, not just for journalism, but for the creative industries.”

“The first step is just to be transparent about what [AI companies] are using. [Then] there are other issues that are of great concern to people,” she added. “There are questions about opting in and opting out [of content being used], and about payment. I am working with the industry on all these things.”

Frazer declined to say what mechanisms would be needed to ensure greater transparency, so that rights holders could understand whether content they created was being used to train AI models.

Greater transparency from developers of the rapidly evolving technology would make it easier for rights holders to track infringements of their intellectual property.

People close to Labour said the government would try to bring forward proposals ahead of an election expected in the autumn. Asked about the timeline, Frazer said she was “working with the industry on all of these things”.

Executives and artists across music, film and publishing are concerned that their work is being unfairly used to train the artificial intelligence models developed by technology groups.

Last week, Sony Music wrote to more than 700 developers, calling on them to disclose the sources used to build their AI systems. In a scathing letter, the world’s second-biggest music group underlined its refusal to allow its music to be used in the training, development or commercialisation of artificial intelligence systems.

The EU is already preparing to introduce similar rules under its AI Act, which will require developers of general-purpose AI models to publish “sufficiently detailed” summaries of the content used for training and implement a policy to uphold the bloc’s copyright law.

In contrast, the UK has been slower to develop such regulations. Officials acknowledge the tension between ministers’ ambition to attract fast-growing AI companies to the UK with a more lenient regulatory environment and the need to ensure that companies in the creative industries are not exploited.

An attempt to create a voluntary set of rules agreed between rights holders and AI developers failed last year, prompting officials to reconsider next steps.

Frazer said the government wanted to create a “framework or policy” around transparency, but pointed to “very complex international issues that are fast-moving”. She said the UK needed to ensure a “very dynamic regulatory environment”.
