How frequently are images generated or modified by diffusion models uploaded to Wikimedia Commons? I can wrap my head around evaluating cited sources for notability, but I don’t know where to start when it comes to determining the repute of photographs. So many of the images Wikipedia articles use are taken by seemingly random people not associated with any organization.
So far, I haven’t seen all that many, and the ones I have seen are very obvious, like a very glossy crab on the beach wearing a Santa Claus hat. I have yet to see one that’s undisclosed, let alone one actively disguising itself, and I have yet to see anyone try using an AI-generated image on a Wikipedia article. Disclosing generative AI use is made trivial in the upload process by an obvious checkbox, so the only way to avoid it is to straight-up lie.
I can’t say how much of an issue this will become in the future, or what good steps for finding and eliminating such images would be if it does.
How would you even know if an image is AI-generated? That was easy in the past, but have you seen what these models are capable of now?