

To build on this, it would help to set up some kind of system monitoring that continuously logs temps, fan speed, and system usage, so OP can check for red flags around the time of a freeze.
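A quick-and-dirty sketch of that kind of logging, assuming a Linux box (the /proc and /sys paths are standard Linux interfaces, but the interval and log format here are just assumptions; lm-sensors and htop are the more complete tools):

```python
import os

def snapshot():
    """Collect a one-shot health snapshot from /proc and /sys (Linux only)."""
    with open("/proc/loadavg") as f:
        load = f.read().split()[0]
    temps = []
    base = "/sys/class/thermal"
    if os.path.isdir(base):
        for zone in sorted(os.listdir(base)):
            path = os.path.join(base, zone, "temp")
            if zone.startswith("thermal_zone") and os.path.exists(path):
                with open(path) as f:
                    temps.append(int(f.read()) / 1000)  # reported in millidegrees C
    return {"load": load, "temps_c": temps}

# Run this in a loop appending to a file; after a freeze, the last
# lines before the gap show what the machine was doing.
if __name__ == "__main__":
    print(snapshot())
```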


Idk about that. In my case I believe my CPU was defective from the start and I lived with it because I always assumed it was my OS in some way.
If your CPU has seven years of not randomly freezing and it’s only doing this now, then I wouldn’t suspect the CPU.
However, unless you find some clues in journalctl -xeb1 or dmesg, I would assume it’s faulty hardware somewhere.


Last time for me it was a bad CPU. Lived with it until I upgraded my CPU and recycled the old one into a new build. Then that one was having the same issue.
Basically guaranteeing themselves the worst code source.
That 127.0.0.1 should be 0.0.0.0.
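For context, here’s the difference that change makes — a minimal sketch, assuming this is about a server’s listen address (e.g. inside a container, where loopback is unreachable from outside):

```python
import socket

# 127.0.0.1 only accepts connections from the same machine;
# 0.0.0.0 listens on every interface, so other hosts (or other
# containers) can actually reach the service.
def make_listener(host: str, port: int = 0) -> socket.socket:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind((host, port))  # port 0 = let the OS pick a free port
    s.listen()
    return s

local_only = make_listener("127.0.0.1")  # loopback only
everywhere = make_listener("0.0.0.0")    # all interfaces
print(local_only.getsockname(), everywhere.getsockname())
```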

Liberals make up more of the consumer class than conservatives. That highly marketable stratum of people with disposable income tends to be affluent, college-educated liberals. It’s why they keep winning the culture war, and it drives conservatives insane.
C was incredibly disruptive 50 years ago.


That’s the most cut-and-dried example of a copyleft license violation imaginable. This would be a violation even under GPLv2. It’s like they had zero idea what that license meant.

“conman.org” falls for basic misdirection.


Automatic Mapping
If a user already exists on one or more connected servers, they can log in directly with their existing Jellyfin credentials. Jellyswarrm will automatically create a local user and set up the necessary server mappings.
If the same username and password exist on multiple servers, Jellyswarrm will link those accounts together automatically. This provides a smooth experience, giving the user unified access to all linked servers.
Someone really should audit the implementation of that feature. So when you first log in, it automatically sends your credentials to every connected server?
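To make the concern concrete, here’s a hypothetical sketch of what auto-mapping like this implies — this is NOT Jellyswarrm’s actual code, and the server list and login stand-in are made up:

```python
import hashlib

# Hypothetical sketch only. The point: on first login, the proxy
# replays the user's plaintext credentials against every connected
# backend, so a single malicious or compromised server sees them all.
SERVERS = ["https://jf1.example", "https://jf2.example"]  # assumed names

def try_login(server: str, username: str, password: str) -> bool:
    # Stand-in for a real Jellyfin authentication request -- a fake
    # deterministic "did this server accept the credentials" check.
    digest = hashlib.sha256(f"{server}:{username}:{password}".encode()).hexdigest()
    return digest[0] != "f"

def auto_map(username: str, password: str) -> list:
    """Link every server where the same username+password succeeds."""
    return [s for s in SERVERS if try_login(s, username, password)]

print(auto_map("alice", "hunter2"))  # credentials were sent to every server either way
```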


I always thought this would make more sense to implement client side, in the media player. But it’s probably easier to implement this way.


Not much in the way of open source solutions. A simple CAPTCHA, however, would cost scrapers more to crack than Anubis.
But when it comes to “real” bot management solutions: The least invasive ones will try to match the User-Agent and other headers against the TLS fingerprint and block if they don’t match. More invasive solutions will fingerprint your browser and even your GPU, then either block you or issue you a tracking cookie, which is often pinned to your IP and user-agent. Both approaches require a large base of data to know what real and fake traffic actually looks like. Only large hosting providers like Cloudflare and Akamai have that data and can provide those sorts of solutions.
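The header-vs-TLS matching idea can be sketched like this — illustrative only, the JA3 hashes and browser families below are made up, and real systems match far more signals:

```python
# The idea: the User-Agent header claims "Chrome", but the TLS
# ClientHello fingerprint (e.g. a JA3 hash) says otherwise -> block.
KNOWN_FINGERPRINTS = {
    "chrome": {"aa1bb2ccdd"},   # hypothetical JA3 hash(es) for Chrome builds
    "firefox": {"ee3ff4aabb"},  # hypothetical JA3 hash(es) for Firefox builds
}

def headers_match_tls(user_agent: str, ja3_hash: str) -> bool:
    """Return True if the claimed browser family matches the TLS fingerprint."""
    ua = user_agent.lower()
    for family, hashes in KNOWN_FINGERPRINTS.items():
        if family in ua:
            return ja3_hash in hashes
    return False  # unknown UA family: treat as suspicious

# A curl/python client spoofing a Chrome UA still presents its own TLS stack:
print(headers_match_tls("Mozilla/5.0 Chrome/120", "aa1bb2ccdd"))  # consistent
print(headers_match_tls("Mozilla/5.0 Chrome/120", "ee3ff4aabb"))  # mismatch, bot-like
```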


The cost of solving Anubis’s PoW is absolutely not a factor in any AI company’s budget. Just the cost of answering one question is millions of times more expensive than running sha256sum for Anubis.
Just in case you’re being glib and mean the businesses will go under regardless of Anubis: most of these scrapers are coming from China, and China will absolutely keep running these companies at a loss for the sake of strategic development.
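A rough back-of-envelope comparison — every number below is an assumption, just to show the orders of magnitude involved:

```python
# All figures are rough guesses, not measured values.
hashes_per_challenge = 5_000_000   # assumed Anubis difficulty (~1s on one core)
hashes_per_joule = 50_000_000      # rough CPU sha256 energy efficiency
electricity_usd_per_kwh = 0.10

joules = hashes_per_challenge / hashes_per_joule
pow_cost = joules / 3_600_000 * electricity_usd_per_kwh  # joules -> kWh -> USD

llm_cost_per_query = 0.002         # assumed fraction-of-a-cent inference cost

print(f"PoW cost/page:  ${pow_cost:.2e}")
print(f"LLM cost/query: ${llm_cost_per_query}")
print(f"ratio: {llm_cost_per_query / pow_cost:,.0f}x")
```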


Places like Cloudflare and Akamai are already using machine learning to detect bot traffic at the network level. You need similar machine learning to evade them. And since most of these scrapers are run for AI companies, I’d expect a lot of the scraper code to be LLM generated.


Here’s one example of a proxy provider offering to pay developers to inject their proxies into their apps (“100% ethical proxies” because they signed a ToS). Another example: BrightData proxies traffic through users of their free HolaVPN.
IoT devices and smart TVs are also obvious suspects.


Or your TV or IoT devices. Residential proxies are extremely shady businesses.


The problem is primarily the resource drain on the server, and tarpitting tactics usually increase that burden by keeping connections open.


This is what I’ve kept saying about PoW being a shit bot management tactic. It’s a flat tax across all users, real or fake. The fake users are making money off accessing your site and will just eat the added expense. You can raise the tax until it costs more than your data is worth to them, but that also hits your real users. Nothing about Anubis even attempts to differentiate between bots and real users.
If the bots take the time, they can set up a pipeline to solve Anubis tokens outside of the browser more efficiently than real users.
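That out-of-browser pipeline is not hard to build. A minimal sketch of a sha256 leading-zeros PoW solver — Anubis’s exact challenge format is an assumption here, not copied from its source:

```python
import hashlib

# Assumed challenge shape: find a nonce so that
# sha256(challenge + nonce) starts with N zero hex digits.
def solve(challenge: str, difficulty: int = 4) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Compiled native code (or a GPU farm) grinds this far faster than the
# JavaScript a real visitor's browser runs, which is the point above.
nonce = solve("example-challenge", difficulty=4)
print("nonce:", nonce)
```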
Just letting AI take the wheel and hitting git commit -am 'Update project files' every few minutes.
It’s like that, but way too sensitive. I’ve deliberately been gentle and it will still fail, especially if I set it upright with fans on the heatsink.