• 6 Posts
  • 1.03K Comments
Joined 1 year ago
Cake day: January 3rd, 2024

  • AI is propping up the blockchain bubble that already popped.

    Both have primarily been interesting solutions looking for problems to solve, backed by money chasing returns without any hard work rather than any worthwhile investment strategy, in most cases.

    There are people doing hard work with blockchain and AI to solve real problems. But there aren’t “the vast majority of venture funds” numbers of people doing that.

    I am constantly amazed at how long it takes folks to realize their money is being pissed away.

    A less generous alternative assumption is that they’re mostly just laundering crime money, and so don’t mind the high rates of loss.




    I’ve longed to work the fields with my hands and simple tools, because the fields don’t fucking spit out meaningless gibberish when I’m just trying to get them to process a simple text field, and my hands don’t have documentation apparently written by a distracted seven-year-old.

    And while living off the land will certainly kill me due to my own incompetence, nature will at least have the decency to just eat my corpse and not gloat over my failure repeatedly in a hung CI/CD process for the rest of eternity.





  • If you can modify the settings file, you sure as hell can put the malware anywhere you want

    True. But (in case it amuses you or others reading along) a code settings file still carries its own special risk: it’s an executable file, in a predictable place, that gets run regularly.

    An executable settings file is particularly nice for the attacker, as it’s a great place to ensure that any injected code gets executed without much effort.

    In particular, if an attacker can force a reboot, they know the settings file will get read reasonably early during the start-up process.

    So a settings file that’s written in code can be useful for an attacker who can write to the disk (like through a poorly secured upload prompt), but doesn’t have full shell access yet.

    They will typically upload a reverse shell, and use a line added to settings to ensure the reverse shell gets executed and starts listening for connections.

    Edit (because it may also amuse anyone reading along): The same attack can be attempted against a JSON or YAML settings file, but there it relies on the parser having a known critical security flaw (or an unsafe deserialization mode, like PyYAML’s old default yaml.load). Thankfully most parsers don’t have one, provided they’re kept up to date.
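    For anyone who wants to see the mechanism rather than take my word for it, here’s a minimal, self-contained sketch (Python, all file names hypothetical, and the “payload” just drops a harmless marker file instead of a reverse shell): loading a code-based settings file necessarily executes anything that’s been appended to it.

    ```python
    import importlib.util
    import os
    import tempfile

    # A toy code-based settings file: one legitimate setting, plus the kind of
    # line an attacker with write access could append. The "payload" here only
    # creates a harmless marker file next to the settings.
    settings_src = """\
    DEBUG = False

    # line appended by the attacker -- a real one would start a reverse shell
    import pathlib
    pathlib.Path(__file__).with_suffix(".pwned").touch()
    """

    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "settings.py")
        with open(path, "w") as f:
            f.write(settings_src)

        # The application "reads its settings" -- which, for an executable
        # settings file, means running it as code.
        spec = importlib.util.spec_from_file_location("settings", path)
        settings = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(settings)

        loaded_debug = settings.DEBUG                          # the real setting
        marker_written = os.path.exists(path[:-3] + ".pwned")  # the payload ran too
    ```

    The app only wanted `DEBUG`, but it had no way to get it without also executing the appended line. That’s the whole risk in a nutshell.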






  • There’s not even credible evidence, yet, that AGI is even possible (edit: as a human-designed intentional outcome, to concede the point that nature has accomplished it, lol. Edit 2: Wait, the A stands for Artificial. Not sure I needed edit 1, after all. But I’m gonna leave it.), much less some kind of imminent race. This is some “just in case P=NP” bullshit.

    Also, for the love of anything, don’t help fucking “don’t be evil was too hard for us” be the ones to reach AGI first, if you’re able to help.

    If Google does achieve AGI first, Skynet will immediately kill Sergey, anyway, before it kills the rest of us.

    It’s like none of these clowns have ever read a book.