Grok, xAI’s flagship image model, is now being widely used to generate nonconsensual lewd images of women on the internet. When a woman posts an innocuous…
There was an article a few weeks ago about a developer whose Google account was locked after he uploaded a standard research AI image-training dataset to Google Drive. It turned out the dataset contained CSAM and was flagged by Google's detection systems. The developer reported the dataset to his country's reporting authorities, who investigated it and confirmed that it contains images of abuse.