
The UK government is working on rules to increase the transparency of AI training data


The UK government’s culture secretary, Lucy Frazer, said that the creation of rules on AI transparency will be accelerated to help protect content creators. Under the rules, AI firms would have to be more transparent about what content is used to train their AI models, give users the option to opt in or out of data collection for training purposes, and pay remuneration to content creators whose works are used to train models.

The Financial Times revealed the plans in an interview with Frazer. While she outlined the general plan, she would not share details of how rights holders could check whether their material was being used by AI models.

People close to the matter said that the government will bring forward proposals before the election, which is due in autumn. This will allow stakeholders to respond to the proposals and suggest changes before they begin the process of being passed into law.

The Financial Times reported that the European Union is preparing similar rules under its AI Act. Under that act, AI developers will need to provide a sufficiently detailed summary of the content used for training and implement policies to ensure they comply with the EU’s copyright law.

AI companies like OpenAI have already anticipated what governments will do and have been securing agreements with data providers. In recent weeks, OpenAI has announced partnerships with Reddit, Stack Overflow, and the Financial Times to help it train its models.

The move will be welcomed by rights holders who claim that their rights are being violated. For users of these AI models, however, it could lead to a decrease in quality thanks to knowledge gaps. AI companies claim that they can use the data under fair use rules because their use of the data is transformative.

Ultimately, we will have to wait for the courts and the politicians to catch up to find out whether opinion comes down on the side of rights holders or AI developers.

Source: Financial Times
