The UK's Children's Commissioner is calling for a ban on AI deepfake apps that create nude or sexual images of children, according to a new report. It states that such "nudification" apps have become so prevalent that many girls have stopped posting photos on social media. And though creating or uploading CSAM is illegal, the apps used to create deepfake nude images remain legal.

"Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone — a stranger, a classmate, or even a friend — could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps," said Children's Commissioner Dame Rachel de Souza. "There is no p [...]
The UK’s media regulator has opened a formal investigation into X under the Online Safety Act. "There have been deeply concerning reports of the Grok AI chatbot account on X being used to creat [...]
Although X removed Grok's ability to create nonconsensual digitally undressed images on the social platform, the standalone Grok app is another story. It reportedly continues to produce "nudified" [...]
At Meta Connect 2025's kickoff event, Mark Zuckerberg unveiled a trio of new smart glasses, including the company's first model with augmented reality. Meta's boss also announced the second generation [...]
Malaysia and Indonesia are the first countries to block Grok, arguing that X's chatbot lacks sufficient safeguards to prevent explicit AI-generated deepfakes of women and children [...]