themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · 4 days ago

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)

cross-posted to: hackernews
bobzer@lemmy.zip · 3 days ago

Why say "sexual abuse material images," which is grammatically incorrect, instead of "sexual abuse images," which is what you mean, and shorter?