Hiker, software engineer (primarily C++, Java, and Python), Minecraft modder, hunter (of the Hunt Showdown variety), biker, adoptive Akronite, and general doer of assorted things.

  • 0 Posts
  • 28 Comments
Joined 1 year ago
Cake day: August 10th, 2023

  • So, the web uses a system called a chain of trust. There are public keys stored in your system or browser that are used to validate the certificates (public keys) presented by the various websites you visit.

    Both Let’s Encrypt and traditional SSL providers work because their root keys are already on your system in the appropriate place, which is what makes the certificates they issue trustworthy.

    All that to say, you’re always trusting a certificate authority on some level unless you’re using self-signed certificates… And then nobody trusts you.

    The main advantage of a paid certificate authority is a bit more flexibility and a fancier certificate for your website, perhaps one that also includes the business name.

    Realistically… There’s not much of a benefit for the average website or even small business.
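
    If you want to poke at this yourself, standard openssl tooling will show the chain a site presents and which root it leads back to (example.com here is just a placeholder host):

      # print the certificate chain the server sends during the TLS handshake
      openssl s_client -connect example.com:443 -showcerts </dev/null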



  • Actually, I think they have it exactly right. The problem is that Republican voters’ views and priorities have been misaligned with those of their party’s representatives for at least a decade.

    Nowhere is this more evident than in evangelical voters jumping through hoops to justify a detestable candidate of poor morals.

    What Trump, the Tea Party before him, etc. represent to the folks who adore them is quite different from what those things actually are.


  • So, the local machine doesn’t really need the firewall; it definitely doesn’t hurt, but your router should already be covering this via port forwarding (IPv4) or straight-up firewall rules (IPv6).

    You can basically go two routes to reasonably harden the system, IMO. Either set up a user without administrative privileges and use something like a systemd system-level service to start the server as that user (which also gives other users a way to control it)… OR, if you’re really paranoid, use a virtual machine and forward the port from the host machine into the VM.

    A lot of what you’re doing is… fine stuff to do, but it’s not really going to help much (e.g. building system packages with hardening flags is good, but it only helps if those packages are actually part of the attack surface, i.e. exposed to remote users in some way).

    Your biggest risk is going to be unvetted plugins doing bad things, and really only the VM or the dedicated user account provides an insulation layer there (on top of the dedicated account, the VM only adds protection against privilege escalation, which is pretty hard to pull off on a patched system anyway).
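
    For the VM route, the port forward on the host can be as simple as a couple of iptables rules (hypothetical setup: the VM sits at 192.168.122.10 on a libvirt-style NAT network, and 25565 is Minecraft’s default port):

      # send incoming Minecraft traffic arriving at the host to the VM
      iptables -t nat -A PREROUTING -p tcp --dport 25565 -j DNAT --to-destination 192.168.122.10:25565
      # allow the forwarded traffic through
      iptables -A FORWARD -p tcp -d 192.168.122.10 --dport 25565 -j ACCEPT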

    My advice for most people:

    • Make a new user on the system to run each game you want to run
    • Run the game using systemd and that user (a sketch of such a unit is below)
    • Use something like kopia + the root user’s crontab (easier than systemd timers, but systemd timers also work) to back up the files on disk
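
    As a rough sketch of that systemd approach (the minecraft user, paths, and memory flag are all assumptions for illustration):

      # /etc/systemd/system/minecraft.service (hypothetical unit)
      [Unit]
      Description=Minecraft server
      After=network.target

      [Service]
      User=minecraft
      Group=minecraft
      WorkingDirectory=/srv/minecraft
      ExecStart=/usr/bin/java -Xmx4G -jar server.jar nogui
      Restart=on-failure

      [Install]
      WantedBy=multi-user.target

    Then systemctl enable --now minecraft starts it at boot, and anyone with sudo can stop/start/restart it without ever logging in as the game user.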

    For Minecraft in particular, to properly back things up on a busy server you need to disable auto-save, manually force a save, do the backup, and then re-enable auto-save afterwards. Kopia can run commands around a snapshot to do that, but you need a plugin running on the server that can react to those commands (or possibly to drive the server console via stdin). Realistically though, that’s overkill, and you’ll be just fine periodically backing up the files exactly as they are.
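
    If you do want the fancy version, kopia’s snapshot actions can run scripts before and after each snapshot (note that actions have to be explicitly enabled on the repository). A hypothetical setup, assuming the server has rcon enabled and the mcrcon tool is installed:

      # pre-snapshot.sh (hypothetical): pause auto-save and flush the world to disk
      mcrcon -H 127.0.0.1 -P 25575 -p "$RCON_PASSWORD" save-off save-all

      # post-snapshot.sh (hypothetical): turn auto-save back on
      mcrcon -H 127.0.0.1 -P 25575 -p "$RCON_PASSWORD" save-on

      # wire the scripts into kopia
      kopia policy set /srv/minecraft \
        --before-snapshot-root-action /srv/minecraft/pre-snapshot.sh \
        --after-snapshot-root-action /srv/minecraft/post-snapshot.sh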

    Kopia in particular will do well here because of its deduplication of backed-up data plus a chunking algorithm that breaks up files. That has saved me a crazy amount of storage vs other solutions I’ve tried. Kopia-level compression isn’t needed because the Minecraft region files themselves are already highly compressed.
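
    If you want to make the no-compression bit explicit, that’s a one-liner in kopia’s policy system (the path is a placeholder):

      kopia policy set /srv/minecraft --compression=none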




  • I’ve been reading her book, and the truancy thing is interesting. She had data showing that kids who weren’t showing up at school, particularly young ones, didn’t learn to read sufficiently well, fell behind in school, and struggled to catch up; they then ended up struggling later in life, often as either victims or perpetrators of crime.

    So, she used the DA’s office to enforce truancy laws across California, encouraged reaching out to fix the problems at home where at all possible, and also encouraged reaching out to folks who had been written off as “not caring” (she cites the example of a father who hadn’t been paying child support but, upon learning that his daughter wasn’t going to school, started taking her to school every morning and volunteering in her classroom).

    Of course this is all by her account, but that sounds overall quite positive to me.


  • Sure, there’s a cost to breaking things up; all multiprocessing and multithreading comes at a cost. That said, in my evaluation, single-file “unity builds” are garbage; sometimes a few files are used to get some multiprocessing back (… as the GitHub link you mentioned references).

    They’re mostly a way to minimize the number of translation units so that you don’t have the “I changed a central header that all my files include and now I need to rebuild the world” problem (with a world that includes many, many small translation units); this is arguably worse on Windows because process spawning is more expensive there.
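
    For anyone unfamiliar, a unity build is literally just a translation unit that #includes the other source files (file names made up for illustration):

      // unity.cpp: the only file handed to the compiler
      #include "renderer.cpp"
      #include "physics.cpp"
      #include "audio.cpp"
      // one compiler invocation builds everything above, so few processes
      // get spawned, but touching any one file recompiles the whole lot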

    Unity builds as a whole are very, very niche, and you’re almost always better off doing a more targeted analysis of where your build (or, often more importantly, your incremental build) is expensive and making appropriate changes. Note that large C++ projects like LLVM, Chromium, etc. do NOT use unity builds (almost certainly because they are not more efficient in any sense).

    I’m not even sure how they got started; presumably they were mostly a way to get LTO without LTO. They’re absolutely awful for incremental builds.


  • Slow compared to what exactly…?

    The worst part about headers is needing to reprocess the whole header from scratch in every translation unit that includes it… but precompiled headers largely solve that (as does just using smaller, more targeted header files).
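
    For reference, the GCC flavor of that is roughly as follows (pch.h and main.cpp are placeholder names; Clang’s -include-pch flow differs a bit):

      # compile the header once into a precompiled form
      g++ -x c++-header pch.h -o pch.h.gch
      # later TUs that #include "pch.h" pick up pch.h.gch automatically
      g++ -c main.cpp -o main.o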

    Even in those cases there’s something to be said for the extreme parallelism of a C++ build. You give some of that up with modules in exchange for better code organization; in some cases modules do help build times, but I’ve heard that in others they hurt (a fair bit of that might just be inexperience with the feature/best practices and immature implementations, but alas).