“Falsehood flies, and truth comes limping after it, so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect: […] like a physician, who hath found out an infallible medicine, after the patient is dead.” —Jonathan Swift

  • 27 Posts
  • 575 Comments
Joined 1 year ago
Cake day: July 25th, 2024


  • Ignoring for a second all the controversy around the term “two-spirit”, even if we say that two-spirit is the extremely Western concept – detached from indigenous culture – of a male and female in the same body (or even just generically two genders in one body), that still doesn’t apply, because all of the entities are male. In set theory, if you keep adding the same element to the set over and over, the set doesn’t change.
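
    As a throwaway illustration of that set-theory point (a hypothetical Rust sketch; the names are mine), a set that already contains an element is unchanged by inserting it again:

        use std::collections::HashSet;

        fn main() {
            let mut genders = HashSet::new();
            genders.insert("male");
            genders.insert("male"); // re-inserting an existing element is a no-op
            genders.insert("male");
            assert_eq!(genders.len(), 1); // the set is still just {"male"}
        }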

    Moreover, even if there were the kind of history you’re talking about, I’m not sure why dissociative identity disorder is being brought up here, because that categorically isn’t how God as multiple entities works within the fiction of the Bible. We see God and Jesus talking to each other back and forth multiple times, and that’s not how DID works. DID – a controversial diagnosis – isn’t a sitcom where two flatmates hang out inside your mind and banter. You’re dissociating so badly that you lose continuity, but God is clearly able to work as all three just fine at the same time.



    • The Father, God, is referred to as “He” consistently thousands of times in modern translations of the Bible, and he’s either literally or metaphorically “the Father”.
    • The Son, Jesus, is unambiguously male in his earthly incarnation, and he’s either literally or metaphorically “the Son”.
    • The Holy Spirit is referred to as masculine in English translations of the Bible, while the Greek text treats the Spirit more like an impersonal force of nature whose pronouns simply follow the grammatical gender of whichever noun is describing it (e.g. “comforter” is masculine, but “spirit” is neuter), and the Hebrew just sticks with the feminine gender of the noun “spirit”.

    If you read a modern English version of the Bible, you have three entities in one, all of which are consistently identified as masculine. Trying to treat God as non-binary with regard to modern English translations is more mental gymnastics than arguing why Kris Dreemurr isn’t non-binary.

    Given this is all fiction, it’s safe to say that death of the author is in play here: 99.99% of the modern Christians who’d get offended at non-binary people existing would also never think of God as non-binary even after pondering it, because their culture and holy book categorically treat God as masculine.


  • It’s technically more money upfront, but you’re not just buying the printer itself: you’re also buying the starter ink/toner cartridges that come with the device. The starter toner gives you vastly more pages than the starter ink, and it basically never goes bad. According to Brother, a starter toner cartridge yields about 1,000 A4 pages. According to HP, their Deskjet and Envy starter cartridges print about 150 and 250 pages, respectively.

    So that higher upfront cost doesn’t just go into a better, more efficient machine; it also goes into quadruple the starting pages or more. There are people who could seriously never print more than 1000 pages, whereas the starter for a Deskjet is so small that you practically ought to buy a spare cartridge alongside the printer for when it near-immediately runs out.

    Basically, if I’m not flat-ass broke, I’m paying another $63 upfront for an XL ink cartridge from HP for one of these printers. And what’s the page yield? 430. I’m still not even near the starter toner cartridge’s page capacity after spending an extra $63 on ink. To me, the upfront cost of an inkjet printer is pragmatically higher unless I’m so boots-theory-of-economics broke that all I can afford is the printer unit, and I only print a few pages a month tops.
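
    To put rough numbers on it (a back-of-the-envelope sketch in Rust, using only the yields and prices cited above; real figures vary by model):

        fn main() {
            let toner_starter_pages = 1000.0; // Brother's A4 rating for a starter toner
            let deskjet_starter_pages = 150.0; // HP Deskjet starter ink
            let xl_pages = 430.0; // HP XL ink cartridge yield
            let xl_price_usd = 63.0;

            // Even after the extra $63, total ink pages still trail the bundled toner:
            let ink_total = deskjet_starter_pages + xl_pages; // 580 vs. 1000 pages
            println!("ink: {ink_total} pages vs. starter toner: {toner_starter_pages} pages");

            // Marginal cost of the XL ink alone: roughly 14.7 cents per page.
            println!("XL ink: {:.1} cents/page", 100.0 * xl_price_usd / xl_pages);
        }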


  • TheTechnician27@lemmy.world to Today I Learned@lemmy.world: TIL about Wiki.js

    I replied to Scrubbles, not to you, OP. If you saw it, I actually edited in “sorry for the brutal honesty, OP” at the end for just a minute, because after I’d already submitted that comment, I misread something you said that made me think this was your work-in-progress hobby project (and it’s really sad that I could’ve thought that to begin with). I did try it here as linked below, and it’s hilariously horrendous. It’s like somebody made a bootleg Docusaurus where the contents of the page are editable, added a poor man’s git diff between edits, and said “done, we’re wiki software now”. There are so many things wrong with this as serious, productive wiki software that I don’t even know where to begin. It’s somehow only barely less terrible than Fandom, and Fandom has 20% of the screen dedicated to actual articles and is a cancer eating away at fan wikis (plugging Indie Wiki Buddy).

    Edit: Is there not even a spot at the bottom of the page for the license the contents of the article are released under? Oh my god. Copyleft is the most singularly important aspect of a healthy, thriving wiki, and instead of telling me a license like CC BY-SA 4.0, it’s saying “Powered by Wiki.js”. I can’t. This is not a serious piece of software created by someone who’s touched a wiki in their life.


  • TheTechnician27@lemmy.world to Today I Learned@lemmy.world: TIL about Wiki.js

    Fandom uses MediaWiki just like Wikimedia projects do, and that also means it uses wikitext rather than markdown. MediaWiki is especially nice because 1) it’s something prolific editors are already familiar with, 2) it has a great WYSIWYG editor called VisualEditor, 3) it’s basically guaranteed to be rock-solid, 4) it has good support and documentation, 5) wikitext is portable to functionally any wiki (apparently except Wiki.js right now, which is genuinely unacceptable for wiki software), and 6) a lot of tools, extensions, and preferences that let you customize your editing experience are made for MediaWiki.

    Looking at Wiki.js as someone with a decade of extensive experience editing and administrating various wikis, it looks very style-over-substance. Assuming the screenshot of their docs is supposed to represent the wiki, it’s basic as all fuck in comparison to what a MediaWiki page is capable of. It’s literally just text, headers, and hyperlinks to other pages. This is something fiddling around with CSS for 20 minutes could produce.

    The sidebar has a bog-standard telescoping ToC, a standard history button (I hope that leads to a full history, anyway), a star rating system*, and a bookmark/share/print icon trio. This is baby’s first wiki. Where are the templates? Captioned images? Tables? Not all pages have to have these things, but Wiki.js gives the reader one (1) image at the top as a first impression, and it’s something totally unremarkable.

    * As someone with 25,000+ edits on Wikipedia, where we actually rate articles (other wikis don’t seriously do this), I can tell you this is absolutely fucking useless. We have a rating system on Wikipedia called Stub, Start, C, B, GA, A (basically disused), and FA. This is on the talk page and is nominally based on various criteria. Almost always, the people using it actually know what they’re doing. Here, though? You’re encouraging substituting an actual talk page discussion (which I don’t even see here) with a useless star rating. Does the star rating reset every time you make an edit in case you resolved past issues? Do the votes get a corresponding message? Will the votes mean literally anything beyond what you could already glean by looking at the page? If I can edit anonymously, can I vote anonymously? It’s just stupid fluff to make up for how utterly redundant this software is to MediaWiki.


  • OP:

    Through the years, archaeologists have found similar results at many other sites in Indonesia, India and China. As the evidence accumulates, it appears that people were able to survive and continue to be productive after Toba blew its stack. This suggests that this eruption might not have been the main cause of the population bottleneck originally suggested in the Toba catastrophe hypothesis.

    While Toba might not help scientists understand what caused ancient human populations to plummet to 10,000 individuals, it does help us understand how humans have adapted to catastrophic events in the past and what that means for our future.

    It’s a good article, and I enjoyed reading it, but did you? I think you should leave this post up, but you could instead retitle it to something like “TIL humanity survived an eruption 74,000 years ago that was 10,000 times larger than the Mount St. Helens eruption”. (Also, Toba is in North Sumatra, Indonesia.)


  • TheTechnician27@lemmy.world to memes@lemmy.world: Maths

    … okay? Yes? Nobody thought otherwise? Do we now have to clarify every statement about algebra by specifying that we’re talking about an algebra over the reals or the complex numbers? Or the polynomials or the p-adic integers, whose multiplications are also commutative?

    No one would call these “n-dimensional” number systems either. The algebra for each of these operates in ℝ¹ and ℝ², respectively, but, like, you would describe their algebras as being over an n-dimensional vector space. It’s not wrong, but I don’t think “two-dimensional number system” is something you’d hear mathematicians say.

    This pedantic aside feels so “I just watched a 3blue1brown video and feel verysmart™” that I don’t know what to do with it. It’s good to be interested in math, but this ain’t it. Everyone knew what they meant.
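
    For what it’s worth, the two-dimensional case is immediate: writing complex numbers over the basis {1, i}, commutativity of multiplication in ℂ falls straight out of commutativity of multiplication and addition in ℝ:

        \begin{align*}
        (a + bi)(c + di) &= (ac - bd) + (ad + bc)i \\
                         &= (ca - db) + (da + cb)i \\
                         &= (c + di)(a + bi).
        \end{align*}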


  • I’d go even further: the learning curve for Rust is shallower than that of C or C++.

    • C is obvious: dealing with strings is a goddamn nightmare in pure C, and strings are used all the time in modern programming. Almost no guardrails for memory safety mean that an inexperienced programmer can easily run into undefined, nondeterministic behavior that makes bug hunting difficult.
    • In C++, there’s a trillion ways to do anything (which varies enormously based on C++ version), and when you make mistakes of even moderate complexity (not “missing semicolon on line 174”), compilers like gcc spit out a gargantuan wall of errors that you need to know how to parse through.
    • Rust, in my experience, gives you a much clearer “good” way to do something with some room for expression, and its compiler tells you exactly what you did wrong and even how to resolve it.

    The fact that the compiler actually guides you, to me, made learning it much easier than C/C++.
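
    As a hypothetical example of that guidance, take the classic use-after-move mistake below: rustc points at the exact move and, in current versions, even suggests a fix such as cloning or borrowing instead.

        fn main() {
            let s = String::from("hello");
            let t = s; // ownership of the heap buffer moves from `s` to `t`
            println!("{t}");
            // println!("{s}"); // uncommenting this yields error[E0382]:
            //                  // borrow of moved value: `s`, with a note pointing
            //                  // at the move above and a suggested fix
        }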


  • I don’t know how else they could react:

    And the compiler was slow, the code that came out was slow…

    The compiler is slower because it has more to check for, but “the code that came out was slow” seems like nonsense, exaggeration, or PEBCAK. Rust code is highly performant and very close to C in speed.

    The support mechanism that went with it — this notion of crates and barrels and things like that — was just incomprehensibly big and slow.

    Dude, what? C’s build systems like CMake are notoriously unfriendly to users. Crates make building trivial compared to the ridiculous hoops needed for C.

    “I have written only one Rust program, so you should take all of this with a giant grain of salt,” he said. “And I found it a — pain… I just couldn’t grok the mechanisms that were required to do memory safety, in a program where memory wasn’t even an issue!”

    He doesn’t say what the program was, and the borrow checker operates by a set of just a few extremely simple rules. We get no idea of what he was trying to accomplish or how the borrow checker impeded that.

    So my reaction as someone who cares deeply about how disastrously unsafe C is and the tangible havoc it creates in modern society:

    • I agree the compiler is slower. Honestly boo hoo. It’s slower for two very good reasons (better static analysis and better feedback).
    • The code being slower is so marginal an issue as to effectively not be true. Benchmarks bear this out.
    • I’m not going to take “big and slow” as a serious critique of Cargo from someone who idealizes C’s ridiculous, tedious, convoluted build system.
    • The borrow checker is trivial, and unlike C, the compiler actually gives you easy, intuitive feedback for why your code doesn’t build.
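
    For anyone who hasn’t touched Rust, the core of those rules is just “any number of shared borrows XOR exactly one mutable borrow, and no reference may outlive its owner”. A minimal sketch:

        fn main() {
            let mut v = vec![1, 2, 3];

            // Any number of shared (&) borrows may coexist...
            let a = &v;
            let b = &v;
            println!("{} {}", a[0], b[1]); // last use of `a` and `b`

            // ...but a mutable (&mut) borrow must be exclusive.
            let m = &mut v;
            m.push(4);
            // Uncommenting the line below would be rejected (E0502), because it
            // would keep the shared borrow `a` alive across the exclusive borrow `m`:
            // println!("{}", a[0]);
        }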

  • They struggled to deliver their ambitious mainline Linux phone on time during Covid yes, but they eventually delivered.

    And what about the people who requested refunds and waited months, if they ever received them at all? Despite Purism pushing their timeline back literal years through repeated delays? I don’t care what challenges they faced; they knowingly took people’s money and refused to give it back when they couldn’t deliver. It’s their responsibility to be prepared for challenges. And in some extreme edge case where they couldn’t have been prepared, it’s their responsibility to be transparent about that with the people who gave them over a million dollars (let alone those who purchased the product after the Kickstarter was finished). Am I to suppose, too, that the pandemic was affecting Purism in January 2019, when they were originally supposed to deliver their product?

    The fact that they did is a huge win for the mobile Linux ecosystem becoming a real contender just when we need it.

    The Librem 5 is not a contender for shit. It’s so overpriced that it can only be successfully marketed to people who care so deeply about their privacy that they’re willing to use an inconvenient mobile OS, get completely boned on hardware specs, and deal with a company notorious for fucking over its customers. Purism’s behavior is a fucking embarrassment to the Linux ecosystem.

    NXP i.MX family debuted in 2013; Intel i7 family in 2008. Their phone uses a 2017 i.MX 8M Quad, the same year they crowdfunded their phone.

    That CPU is based on the ARM Cortex-A53 and Cortex-M4, launched in 2012 and 2010, respectively.

    2017 i7 computers are equally not from 2008…

    When I say “2013”, I’m not talking about the debut year of i.MX. I’m talking about the fact that you can compare this phone side-by-side with a Galaxy S4 or S5. 3 GB of RAM, 32 GB of eMMC storage, a 720 × 1440 IPS display, no NFC, USB 3.0, an 8/13 MP front/back camera (which they inexplicably call “Mpx”; good job, guys), 802.11n Wi-Fi, no waterproofing, and a shitty-ass i.MX 8M CPU. I still remember watching a trailer for the Librem 5’s continuing development, and as they were scrolling through a web browser, it was noticeably stuttering. This was years and years ago; I can’t even imagine it today.

    It still today remains one of the best ARM processors with open source drivers without an integrated baseband. It means basically any flavour of Linux can install on the device, with a significant layer of protection from carrier conduited attacks. Other modules have similar tradeoffs between performance and interoperability/security.

    I do not give even the slightest inkling of a shit to try to confirm or deny this, so I’m just going to assume it’s 100% true, because it’s not relevant to the point: the spec is absolute trash and is being sold for $800. If you are not absolutely married to privacy, this is not a sellable product in 2025.

    Want better specs? We either need SoC companies to release more of their drivers open source, or more people to patiently reverse engineer closed source ones.

    Actually, if I want better specs, I’m just going to go out and buy a phone that isn’t from Purism. It really sucks that it’s not open, private hardware, but Purism is such a scummy company that so wantonly fucks over their customers that I wouldn’t touch the Librem 5 even if I could justify spending $800 for that spec just for privacy’s sake.