

I’ve been thinking about setting up Anubis to protect my blog from AI scrapers, but I’m not clear on whether this would also block search engines. It would, wouldn’t it?
As someone else said here, programmers are not a monolith. However, I’ve seen it many times, both on the job and on social media: programmers using these tools to write code voluntarily. The code produced is often garbage and I have to reject it at review time, but there are a lot of programmers using these things willingly.
Points for the exceptional choice of name.
I have much the same:
The only difference is that I’m using a Synology, 'cause I have 15 TB and don’t know how to do RAID myself, let alone with an old laptop. I can’t really recommend a Synology though: it’s got too many useless add-ons, and simple tools like rsync never work properly with it.
Yeah this was a deal-breaker for me too.
Well presumably there are at least some performance and safety benefits to using these new alternatives. Otherwise it’s just a blatant license dodge.
Yes. Tailscale is surprisingly simple.
# systemctl start tailscaled
# tailscale up
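On first run, tailscale up gives you a login link; authenticate once in the browser and the machine shows up on your tailnet. tailscale status will confirm it’s connected.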
Debian should fork it and re-license it under the GPL.
In addition to the excellent examples posted here that refute this, I want to add “Last Exile”, “Wonderful Days”, and “Chrono Crusade”.
I have been in precisely this position. The pain is real.
Thank you. You just made my day.
I had the same reaction until I read this.
TL;DR: it’s 10-50x more efficient at cleaning the air and actually generates both electricity and fertiliser.
Yes, it would be better to just get rid of all the cars generating the pollution in the first place and put in some more trees, but there are clear advantages to this.
It was Castlevania that did this to me.
I’m quite happy with EuroDNS. They even include free email hosting if you want it.
It’s a rather brilliant idea really, but consider the environmental implications of forcing every web request to perform proof of work before it will function: this effectively burns a bit more coal for every site that implements it.
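To make the cost concrete: as I understand it, these challenges are hashcash-style proof of work, where the browser brute-forces a nonce until a hash meets a difficulty target. A rough sketch of the idea in Python (not the project’s actual code; the challenge string and difficulty here are made up):

import hashlib

def solve(challenge: str, difficulty: int = 4) -> int:
    # Keep trying nonces until the SHA-256 digest starts with
    # `difficulty` hex zeros. Every failed attempt is wasted CPU,
    # which is exactly where the energy cost comes from.
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

print(solve("example-challenge"))

Multiply that wasted CPU across every visitor to every site running it and it adds up.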
Philosophy Tube recently did a great video on exactly this. It’s on Nebula for those who subscribe.
Sure, but what you “get done” on your own is statistically irrelevant. To achieve useful, measurable success in the fight against climate change, collective action must be taken at scale. That’s government.
This has strong Alfur energy.
This all appears to be based on the user agent, so wouldn’t that mean that bad-faith scrapers could just present a typical search engine user agent?
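To be clear how trivial that spoofing would be, it’s a single header. A quick Python sketch (the Googlebot-style string is just an example, and example.com is a placeholder):

import requests

# Pretend to be Googlebot; nothing verifies this header by itself.
headers = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}
resp = requests.get("https://example.com/", headers=headers)
print(resp.status_code)

The usual defence is to verify alleged search-engine crawlers with a reverse DNS lookup of the requesting IP, which a user agent string alone can’t fake.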