EU member states have agreed on a position on online child protection legislation that doesn't force global tech companies to identify and remove child sexual abuse material (CSAM). This is being seen as a major victory for US tech companies like Google and Meta, according to reporting by Reuters.

This new European Council language contradicts a 2023 European Parliament position that would have required messaging services, app stores and ISPs to report and remove CSAM and instances of grooming. The proposed legislation doesn't have any of that.

Instead, it tasks major tech companies with assessing the risks of their services and taking preventative measures as they deem necessary. It leaves enforcement up to individual national governm [...]
The National Center for Missing and Exploited Children said it received more than 1 million reports of AI-related child sexual abuse material (CSAM) in 2025. The "vast majority" of that cont [...]
xAI, which is already facing multiple investigations around the world over widespread reports that Grok repeatedly created sexualized images of children, is now the target of a class action lawsuit. Three te [...]
The UK’s media regulator has opened a formal investigation into X under the Online Safety Act. "There have been deeply concerning reports of the Grok AI chatbot account on X being used to creat [...]
Elon Musk isn't the only party at fault for Grok's nonconsensual intimate deepfakes of real people, including children. What about Apple and Google? The two (frequently virtue-signaling) com [...]
California authorities have launched an investigation into xAI following weeks of reports that the chatbot was generating sexualized images of children. "xAI appears to be facilitating the large- [...]