• over_clox@lemmy.world · 11 hours ago

      No, I learned programming back when programmers actually worked in binary and sexadecimal (Ok IBM fanboys, they call that hexadecimal now, since IBM doesn’t like sex).

      I still use the old measurement system, save for the rare occasions I gotta convert it into layman's terms for the average person.

      It tells you a lot really quick when you're talking to someone and they don't understand why 2^10 (1024) is the underlying standard the CPU likes.

      Oh wait, there’s a 10 in (2^10)…

      Wonder where that came from?.. 🤔
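      Just to spell it out, here's a tiny C sketch of where that 1024 comes from: shifting 1 left by 10 bits is literally 2^10, and the bigger binary prefixes are just more shifts.

      ```c
      #include <stdio.h>

      int main(void) {
          /* 2^10: shift 1 left by 10 bits -- that's the 1024 */
          unsigned int kib = 1u << 10;   /* 1024    = 1 KiB */
          unsigned int mib = 1u << 20;   /* 1048576 = 1 MiB */
          printf("2^10 = %u\n2^20 = %u\n", kib, mib);
          return 0;
      }
      ```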

      I dunno, but bit-shift multiplications and divisions by powers of two are super fast in the integer realm, while the same scaling by powers of ten gets dogshit slow because the CPU has to do real division.
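      A minimal C sketch of what I mean, assuming a straightforward compiler/CPU model: the power-of-two scaling is a single shift, while the power-of-ten scaling needs an actual integer division.

      ```c
      #include <stdio.h>

      int main(void) {
          unsigned int bytes = 5242880;        /* 5 MiB worth of bytes */

          /* Powers of two: a single shift, no real division needed */
          unsigned int kib = bytes >> 10;      /* divide by 1024    -> 5120 */
          unsigned int mib = bytes >> 20;      /* divide by 1048576 -> 5    */

          /* Powers of ten: the CPU has to do an actual integer division */
          unsigned int kb = bytes / 1000;      /* -> 5242 */
          unsigned int mb = bytes / 1000000;   /* -> 5    */

          printf("%u KiB, %u MiB vs %u kB, %u MB\n", kib, mib, kb, mb);
          return 0;
      }
      ```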

        • over_clox@lemmy.world · 10 hours ago

          But if you fall into the folly of decimal on a device inherently meant to process binary, you might allocate an array of 1000 items rather than the natural binary size of 1024, and if the rest of the code assumes the full 1024, that's a memory overflow waiting to happen…

          Like, sell by the 1000, but program by the 1024.
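          A toy sketch of that "sell by the 1000, program by the 1024" idea; next_pow2 here is just a made-up helper, not anything standard.

          ```c
          #include <stdio.h>
          #include <stdlib.h>

          /* Hypothetical helper: round a requested count up to the next power
             of two, so the allocation matches what binary-minded code expects. */
          static size_t next_pow2(size_t n) {
              size_t p = 1;
              while (p < n)
                  p <<= 1;
              return p;
          }

          int main(void) {
              size_t marketed = 1000;                    /* "sold" as 1000 items */
              size_t actual = next_pow2(marketed);       /* programmed as 1024   */

              int *buf = malloc(actual * sizeof *buf);   /* allocate the full 1024 */
              if (!buf)
                  return 1;

              printf("sold as %zu, allocated %zu\n", marketed, actual);
              free(buf);
              return 0;
          }
          ```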