Let’s say there’s a camera system built following a direct public vote and rolled out by a political party everyone agrees defends democracy. The stated goal is catching red-light violations and speeders, and it’s a popular system. As part of its functionality it reads license plates; every read is verified by a human, and no footage is stored unless there’s a violation.
Is that system fascist? Most would say no, and systems like it exist in many states, including California and Washington.
Then, in the next election, a fascist is elected, and one of their first moves is to repurpose that system to track undesirables, and now it stores a ton of footage.
Is that system now fascist? It’s the exact same system as in the previous example; it’s just being used for fascist ends, such as tracking vehicles with certain plates (e.g., illegal immigrants, minorities, etc.). Nothing has changed in the capabilities or programming of the system; the only changes are when it captures footage, what people use it for, and how long it stores it.
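To make that concrete, here’s a rough sketch (hypothetical names and placeholder plates, nothing from a real vendor) of what that repurposing amounts to: the storage logic is identical in both scenarios, and the only thing that changes is the policy configuration it’s handed.

```python
from dataclasses import dataclass, field

@dataclass
class CameraPolicy:
    """Hypothetical policy knobs; the storage logic below never changes."""
    capture_only_on_violation: bool   # triggered by red light / radar vs. always-on
    require_human_review: bool        # a person confirms every plate read
    retention_days: int               # 0 = discard footage when there is no violation
    watchlist: set[str] = field(default_factory=set)  # plates flagged for tracking

def should_store(plate: str, violation_detected: bool, policy: CameraPolicy) -> bool:
    """The 'programming' of the system: identical under both policies."""
    if plate in policy.watchlist:
        return True   # flagged plates are always kept
    if policy.capture_only_on_violation and not violation_detected:
        return False  # scenario 1: nothing is stored without a violation
    return True       # scenario 2 keeps everything it sees

# Scenario 1: the system the voters approved
traffic_enforcement = CameraPolicy(
    capture_only_on_violation=True,
    require_human_review=True,
    retention_days=0,
)

# Scenario 2: the exact same code after the new government reconfigures it
mass_tracking = CameraPolicy(
    capture_only_on_violation=False,
    require_human_review=False,
    retention_days=3650,
    watchlist={"ABC1234"},  # placeholder plate standing in for 'undesirables'
)

print(should_store("ABC1234", violation_detected=False, policy=traffic_enforcement))  # False
print(should_store("ABC1234", violation_detected=False, policy=mass_tracking))        # True
```

The capabilities are identical either way; the difference lives entirely in the policy object someone hands it.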
Yes, it’s theoretically possible to design a fascist system, such as an LLM that only gives fascist answers, but that’s an incredibly narrow definition.
Just because a product has a plausibly deniable use case doesn’t really mean that it’s not functionally political.
If someone creates a super invasive surveillance system and initially uses it for a seemingly benign purpose, that doesn’t mean the intention all along wasn’t more nefarious, especially if the system was practically irresistible to power structures and its use directly led to authoritarianism. Like giving someone their first hit for free.
In a case like that, I would discount the benign use as a red herring, and say that the software is functionally political.
The intention can be fascist, sure, but that doesn’t mean the solution is fascist.
For example, I think it’s pretty clear that Lemmy was designed by tankies to create a safe space for tankies (why would the instances the main devs maintain be overly protective of China and Russia if it weren’t?), but that doesn’t make Lemmy “tankie”; it’s a software project that can be used by fascists, tankies, commies, anarchists, statists, etc., because it’s just a software program.
Likewise, a surveillance system can be used by a fascist government, by a private company to protect company secrets, by a government agency like the Pentagon for internal use, or even by private individuals to ID who’s at the door. It’s only fascist if it’s used to further fascist goals, like identifying minorities or protestors. But even then, it’s not the software that’s fascist but the whole system, meaning how people use it and the policies in place.
The chance of a given piece of software being “fascist” is incredibly low, since it would need to act in a fascist way and only a fascist way, or only be useful for fascist ends. Like the fascist LLM example I gave, or a training simulator that is hard-coded to only present fascist ideology.
Like the fascist LLM example I gave, or a training simulator that is hard-coded to only present fascist ideology.
Right. That’s what we’re talking about.
But I think the bar is a little lower. I think it’s enough for it to be primarily useful for (e.g.) fascist goals. If it happens to have minor non-fascist uses, I don’t think that materially changes anything.
I don’t think that Lemmy is primarily useful for furthering tankie goals.
I think that privacy invading surveillance systems are primarily useful for furthering authoritarian goals, by intention or not. There are some nice alternative uses, but I think that the use case of primary importance is in service to authoritarianism, which makes it authoritarian software.
I think that privacy invading surveillance systems are primarily useful for furthering authoritarian goals, by intention or not.
And I disagree. I think this all started when we allowed things like traffic light cameras, speed cameras, and toll cameras to automatically bill based on license plate. I don’t think most would consider those to be “primarily useful for furthering authoritarian goals,” they’re merely there for routine law enforcement with specific goals.
Flock cameras are basically that exact same system, but instead of only capturing when something tangible is triggered (red light, radar, or toll-booth motion sensor), they passively collect information. Flock is a private company that sells its surveillance services to cities (and private orgs) to assist with tracking down license plates or alerting when a gunshot is detected. This is allegedly legal because you don’t have any expectation of privacy when you’re in public (hence why Ring doorbells are legal), and private companies don’t have to follow the same rules as law enforcement. I personally don’t think Flock’s founders are fascist; they seem to genuinely want to help reduce crime. I worked for a similar company that mostly did perimeter security (i.e., it generally only operated on private property), and the founder was absolutely not fascist, but they did want to help reduce crime.
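The practical difference is what that passive collection lets you build. As a rough sketch (hypothetical data structure, not Flock’s actual schema): a triggered camera produces isolated violation records, while a passive network accumulates a per-plate location history that can be queried later.

```python
import datetime
from collections import defaultdict

# Hypothetical sighting index, not any real vendor's schema.
# A triggered camera only appends on a red-light / radar / toll event;
# a passive network appends every plate that passes every camera it owns.
sightings: dict[str, list[tuple[datetime.datetime, str]]] = defaultdict(list)

def record(plate: str, camera_location: str) -> None:
    """Called once per plate read."""
    sightings[plate].append((datetime.datetime.now(), camera_location))

def movement_history(plate: str) -> list[tuple[datetime.datetime, str]]:
    """With passive capture, this reconstructs where a car has been over time,
    which is what makes the same hardware useful for tracking people."""
    return sorted(sightings[plate])
```

The hardware and the code are the same either way; what changes is how much of that history exists and who gets to query it.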
I personally don’t consider either of those systems fascist by nature, but they can be used to achieve fascist goals. Tracking burglars across neighborhoods doesn’t sound especially fascist to me, but tracking protestors certainly does. These are very dangerous technologies that can easily be used for fascist purposes, so I think we shouldn’t allow them to be used at all, not because they are fascist, but because they can easily be turned to fascist ends just by changing the conventions around their use.
I don’t think we need to label these systems as authoritarian or fascist to oppose them; we just need to point out how easily they can be misused.
The extension of the argument I’m making (and maybe the one they’re making, kinda?) is that it’s functionally the same as if the software were political.
You can make software that nearly exclusively benefits a particular political belief or family of beliefs.
So even if it’s not actually technically political, it can be functionally political, at which point the argument is splitting hairs.
I think those are important hairs to split.