Roblox, Discord, OpenAI and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which hopes "to build scalable, interoperable safety infrastructure suited for the AI era."

The organization plans to provide free, open-source safety tools that public and private organizations can use on their own platforms, with an initial focus on child safety. The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Partner companies are providing both the funding for these tools and the technical expertise to build them.

The operating theory of ROOST is that access to generative AI is rapidly changing the online landscape, ma [...]
Discord co-founder and CTO Stanislav Vishnevskiy wants you to know he thinks a lot about enshittification. With reports of an upcoming IPO and the news of his co-founder, Jason Citron, recently steppi [...]
The state of Louisiana is suing online gaming platform Roblox, alleging that it fails to adequately protect its majority underage user base from online predators. In its lawsuit, the state alleges [...]
Following a wave of lawsuits alleging that Roblox doesn't provide a safe environment for its underage users, the gaming platform made a series of sweeping updates to its policies. To address rece [...]