Recently, I wrote an article about my journey in learning about robots.txt and its implications for the rights to the data I publish on my blog. I was confident that I wanted to ban all crawlers from my website. It turned out there was an unintended consequence that I did not account for.
My LinkedIn posts became broken

Ever since I changed my robots.txt file, my LinkedIn posts no longer showed a preview of the linked article. I was not sure what the issue was at first, since previews had always worked fine before. On top of that, I noticed that LinkedIn's algorithm had started serving my posts to fewer and fewer connections. I was confused and assumed it might be a temporary problem, but over the next two weeks the missing previews never came back.
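The breakage above is consistent with a blanket robots.txt ban also blocking LinkedInBot, the user agent LinkedIn uses to fetch pages for link previews. A hedged sketch of a robots.txt that blocks well-known AI-training crawlers while explicitly allowing the preview fetcher might look like this (the exact set of user-agent tokens to block is an assumption and depends on which crawlers you care about):

```
# Sketch: block AI-training crawlers, keep link previews working.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# LinkedInBot fetches pages to build post previews; a blanket
# "User-agent: *" Disallow would block it too, so allow it
# explicitly before the catch-all rule.
User-agent: LinkedInBot
Allow: /

User-agent: *
Disallow: /
```

Note this only works for crawlers that honor robots.txt; it is a request, not an access control.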
The page seems written to suggest it, but it never explicitly says that the other bots' crawls can't feed into some other sort of AI training. It would be in Google's interest to mislead users here.
Edit: I found a quote saying that Googlebot does both in one pass: “Google-Extended doesn’t have a separate HTTP request user agent string. Crawling is done with existing Google user agent […]” — and I guess Cloudflare doesn’t trust Google to abide by the access controls. That seems sensible to me.
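In other words, per the quote above, Google-Extended is a robots.txt control token rather than a separate crawler: the fetch still arrives with Googlebot's user agent, and the token only governs how the crawled content may be used. A minimal sketch of opting out of AI training while keeping Search indexing might look like this:

```
# Google-Extended is a control token, not a distinct bot; the
# request itself still comes from the Googlebot user agent.
# Disallowing it opts the site out of AI-training use.
User-agent: Google-Extended
Disallow: /

# Regular Search crawling and indexing stay allowed.
User-agent: Googlebot
Allow: /
```

Since both directives are enforced on Google's side after the fetch, a server (or Cloudflare) inspecting the incoming user agent cannot tell the two purposes apart — which is presumably why Cloudflare treats them as one.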
It would be a lot to write if you had to list everything something does not do rather than what it does.
I looked at what the Cloudflare CEO said again. To be fair to him, he is not actually backing you up. He’s saying that Google draws no distinction between the AI overview and the other search results. That is true: the AI overview is a search feature. I’m not sure why someone would want their link listed in search results but not appear far more prominently in the AI overview.