X is acting to comply with UK law, Keir Starmer tells MPs amid Grok row


Elon Musk's AI chatbot Grok will no longer be able to edit photos of real people to put them in revealing clothing after backlash against the tool in the US and UK.

"We have implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing such as bikinis," the company said in a statement.

The move appears to be a response to a threat issued on Wednesday by California's top prosecutor.

California Attorney General Rob Bonta said: "This material, which depicts women and children in nude and sexually explicit situations, has been used to harass people across the internet."

"This restriction applies to all users, including paid subscribers," reads the announcement on X, which operates the Grok AI tool.

The change comes after the British Government's repeated criticism of Elon Musk and X for allowing the tool to be used in this way.

Speaking in the Commons today, Sir Keir Starmer said: “The actions of Grok and X are disgusting, and they’re shameful.

“Frankly the decision to turn this into a premium service is horrific, and we’re absolutely determined to take action.

“X has to act, and if not Ofcom has our full backing.

“I have been informed this morning that X is acting to ensure full compliance with UK law. But we’re not going to back down. We will strengthen existing laws and prepare legislation if it needs to go further.”

The Government announced last week that generating sexual images without consent will be made illegal.

Technology Secretary Liz Kendall said the criminal offence would be brought into force this week under the Data (Use and Access) Act passed by Parliament last year.

However, a spokesman for the Department for Science, Innovation and Technology said that while the legal steps to introduce the offence were being taken this week, it would not come into force until February.

The spokesman said: "The ban will come into force in early February, 21 days after being signed – as is standard practice.

"But platforms already have a legal duty to stop the proliferation of these images under the Online Safety Act.

"X doesn't need to wait for the Ofcom investigation to conclude."

Nudification apps will also be criminalised under the Crime and Policing Bill, which is currently going through Parliament, making it illegal for companies to supply tools used to create non-consensual intimate images.