Is it just me, or does that seem … abrupt?
The timeline is not super abrupt, especially since all he is asking of those architectures is that they get their Rust toolchain in order. That is doubly true when you consider that Rust is already well maintained on all the Debian architectures that people actually use.
The abruptness (almost rudeness) is in the second part where he basically says that, if you cannot get Rust going in time, you should just stop releasing Debian for that architecture.
It is mostly just poorly worded though, because none of these architectures have “official” support even now. This will not be the only way they are behind. So, there is no reason to be so dramatic.
And that would be my response to him. Another option here is that these alternative architectures just continue to ship an older version of APT for now. Emergency avoided. Few of them ship with up-to-date versions of APT even now.
Another solution is to use one of the multiple projects that are working to make Rust code build with the GCC compiler back-end. At least one of these projects has already announced that they want to work with these Debian variants to ensure that APT builds with them.
So, the 6-month timeline is a reasonable impetus to make that happen, which would be quite a good thing beyond just APT.
There are many other useful tools written in Rust that you are going to want to use on these architectures. It will be a fantastic outcome if this pressure from APT kickstarts that happening for some of these long-abandoned (by the mainstream at least) architectures.
There’s time until March for the maintainers of the 3 niche architectures to organize and make Rust available for them. Doesn’t sound that abrupt to me.
For small niches, six months can be rather abrupt.
My niche can take 5 days or 5 months, depending on ADHD
Wasn’t there a Rust-to-C compiler that would circumvent this limitation?
Yes. There is also a GCC front-end for Rust (does not go to C first).
For package maintainers, it’s reasonable to expect security updates are rolled out the same week that a vulnerability is found. If you can’t deploy a new version of a package in 6 months, not maintaining the package is also a valid option.
but this is not a vulnerability; it is adding a CPU architecture to a programming language
Rust adds another layer of having to trust that the compiler isn’t backdoored. All UNIX/Linux systems use the gcc toolchain, so having it written in C would mean fewer dependencies for the OS.
- paraphrased from someone who seemed to know more about this stuff in this unrelated discussion yesterday
Strange times.
This seems relevant:
yes very, and I hate it. rustic metal is never good ffs!
So that’s why, whenever I try to find a package in apt, I have to iterate through thousands of similarly named
librust-{dictionaryword}-{component}-dev packages in order to find the simple component I want… Apt repos have really been trying too hard for granularity; I’m pretty sure there are more librust packages than actual end-user program packages. Still gonna use Debian.
Yeah, it’s pretty good. But now that I’ve started studying, I don’t have time to keep learning it (I’ve literally made a personal wiki with various documentation for myself), so I am thinking of switching to an atomic distro instead (maybe KDE Linux).
That tells you something about apt. Not because Rust is bad or anything, but because Rust is more like C++ than C.

oh man, and here I was about to finish migrating to Debian.
Are you migrating to Debian on a niche CPU architecture? If not, this doesn’t affect you.
currently running on a P2 at 333 MHz. I guess it’s pretty common though.
Didn’t Debian drop i386? Are you running Debian Bookworm?
bookworm? that’s a stretch…😉
That joke’s so funny, it’s making me a bit wheezy…
This shoehorning of Rust feels more artificial than Wayland’s. 🤔
I’m pretty certain I know how you feel about systemd, too.
You can think that Wayland adoption was artificial, but X.Org is unmaintained software and no developers are picking up the reins of X11. X is dead.
Some devs with strong opinions have forked X11:
Ah, those bigots who hide behind “everybody is welcome”. Calling DEI a discriminatory policy, my ass…
Exactly
Yeah but his patches are so bad they almost all needed reverting so…
Uhm, what?
Wayland has been in the works for more than a decade. Granted, there are some people having issues with it, with proprietary hardware (nVidia) and not-so-common setups like two monitors, but it happens that they are the noisiest. For the rest of us it’s been great, stable, and it feels snappier than X.
If you want to talk about shoehorning stuff into Debian, talk about systemd.
not-so-common setups like two monitors
wat.jpg
I assume they mean “weird two-monitor setups” that are not so common, not two-monitor setups as a whole, as Wayland works perfectly with two monitors. It even works way better than X11 if your monitors are different, like if only one has VRR or if both monitors need different scaling.
Exactly my thoughts. What does this joker even mean? I regularly use 2-3 monitors, and have used four in certain roles. Almost everyone I know that really uses their machine has, at minimum, two screens.
the ONLY thing I can think of is that sometimes, at least for me, on Wayland it will switch the naming of my second monitor between DP-1 and HDMI-A-1 randomly for whatever reason. Bit of a very minor pain if I’m using a WM where I have to go in and edit the config to switch it, but on KDE it’s not an issue. That’s literally the only thing I can think of.
Second that. The person just needs to plug a cable into a cheap second-hand screen he/she bought and it’s pretty much done, so I can’t see why it wouldn’t be common.
Yeah that’s quite silly. Every single employee at my office is issued 2 monitors to go with their company laptop. People working from home get the monitors shipped to them. It’s the standard setup in tons of offices as well as for many home users.
To explain: changes that happen too abruptly feel artificial. Wayland’s been around for a while, sure, but it was barely adopted and then a lot of people started insisting on it overnight.
There is this really strange perception amongst Wayland critics that it had low market share and nobody was using it.
The majority of Linux desktop users are on Wayland and we still have people posting that nobody is using it or even that it “doesn’t work”.
Wayland became the default in places where it was already popular and is becoming required in places where few are switching away from the default.
Maybe cause Apt is slow?
edit: maybe it’s a placebo effect or I’m misremembering :P
Apt feels like one of the faster package managers. dnf is slow, yum is snail speed, zypper is slow as fuck too. Apt and Pacman are by far on the faster side.
Alright, thanks for the correction.
Dnf5 is absolutely not slow.
Moreover, apt’s output is God awful. How hard is it to put each package on its own line when doing an upgrade? Its commands are also esoteric (Madison?).
It works and I like Debian but apt is very Meh.
Compared to what?
I know Red Hat has a new package manager whose name I can’t recall right now, but before that it was rpm with yum, and holy crap that was molasses slow.
or maybe it’s a placebo effect or I’m misremembering :P
I have strong doubts that Rust could significantly speed up software that’s written in C or C++.
Rust is generally not going to outperform well-optimized C code.
That said, it is far easier to write performant Rust code than C code. So what we see is that projects that move to Rust frequently see performance gains.
That just means the initial C code was not that great performance-wise. From observation, most C projects are fairly unoptimized.
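To make that concrete, here is a minimal sketch of my own (not code from APT or any project mentioned here): an idiomatic Rust iterator chain that is easy to write but still compiles down to roughly the same tight loop a carefully hand-tuned C version would need.

```rust
// Hypothetical example: sum the squares of the even numbers in a slice.
// Written the "easy" way in Rust; the optimizer removes the bounds checks
// and closure overhead, so little performance is left on the table.
fn sum_even_squares(values: &[u64]) -> u64 {
    values
        .iter()
        .filter(|&&v| v % 2 == 0) // keep only even values
        .map(|&v| v * v)          // square them
        .sum()                    // accumulate without any allocation
}

fn main() {
    let data: Vec<u64> = (1..=10).collect();
    // 2² + 4² + 6² + 8² + 10² = 220
    println!("{}", sum_even_squares(&data));
}
```

The point is not that C could never match this, only that in C you would usually write and hand-tune the explicit indexed loop yourself, and in practice many projects never get around to that.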