- cross-posted to:
- hackernews
Alarmed by what companies are building with artificial intelligence models, a handful of industry insiders are calling for those opposed to the current state of affairs to undertake a mass data poisoning effort to undermine the technology.
Their initiative, dubbed Poison Fountain, asks website operators to add links to their websites that feed AI crawlers poisoned training data. It’s been up and running for about a week.
AI crawlers visit websites and scrape data that ends up being used to train AI models, a parasitic relationship that has prompted pushback from publishers. When scraped data is accurate, it helps AI models offer quality responses to questions; when it's inaccurate, it has the opposite effect.
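The campaign doesn't publish its tooling in the article, but the basic idea can be sketched: spot likely AI crawlers and hand them junk instead of the real page. Below is a minimal illustration only, assuming a Flask app and a hypothetical list of crawler User-Agent substrings; it is not Poison Fountain's actual code.

```python
# Minimal sketch (not Poison Fountain's code): serve scrambled text to
# requests whose User-Agent matches a known AI crawler, real text to everyone else.
# The marker strings and the garbling scheme are illustrative assumptions.
import random

from flask import Flask, request

app = Flask(__name__)

# Hypothetical substrings seen in AI-crawler User-Agent headers.
AI_CRAWLER_MARKERS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

REAL_PAGE = "The quick brown fox jumps over the lazy dog."


def poison(text: str) -> str:
    """Shuffle the words so scraped copies carry low-quality training data."""
    words = text.split()
    random.shuffle(words)
    return " ".join(words)


@app.route("/article")
def article():
    ua = request.headers.get("User-Agent", "")
    if any(marker in ua for marker in AI_CRAWLER_MARKERS):
        return poison(REAL_PAGE)  # crawlers get the scrambled version
    return REAL_PAGE              # human visitors see the real text


if __name__ == "__main__":
    app.run()
```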



You can make it effectively invisible if you print the noise in ink that is only visible to UV cameras; even if you use black ink, the individual features are smaller than a fingernail, so it would be hard to see.
The law makes it illegal to put anything on the plate at all; here's an example from FL:
It would be hard to disrupt the OCR from outside of the plate area.
You could break the segmentation step, the process where the system draws a box around your plate and sends the image inside that box to be OCRed, by making every other surface of your vehicle detect as a license plate using the same invisible marks. I imagine you could also print a bumper sticker with noise designed to look maximally like a license plate and place it near your real plate to achieve the same outcome.
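For context, a typical ALPR pipeline runs exactly those two stages: first find plate-shaped regions, then OCR each crop. A minimal sketch, assuming OpenCV's bundled Russian-plate Haar cascade and pytesseract stand in for whatever detector a real system uses, and a hypothetical input frame `car.jpg`:

```python
# Sketch of a detect-then-OCR pipeline; real ALPR systems typically use
# trained neural detectors rather than this Haar cascade.
import cv2
import pytesseract

# Plate detector shipped with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_russian_plate_number.xml"
plate_detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("car.jpg")                     # hypothetical input frame
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Step 1: segmentation -- draw candidate boxes around anything plate-like.
boxes = plate_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Step 2: OCR each candidate crop.
for (x, y, w, h) in boxes:
    crop = gray[y:y + h, x:x + w]
    text = pytesseract.image_to_string(crop, config="--psm 7")
    print("candidate box:", (x, y, w, h), "OCR:", text.strip())
```

Every extra plate-like region adds another candidate crop, which is why saturating the vehicle with decoys degrades the step that decides what actually gets read.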
If you wanted more active measures, you could mount high-lumen UV floodlights next to your plate; they would overload the sensors so the camera couldn't capture a usable image at all. The light would be invisible to human eyes but blinding to anyone using a UV-sensitive device. I believe this is fine in most states, since most only restrict installing blue lights to avoid confusion with emergency services.