• Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage
  • RAM usage spikes from 1GB to 4GB on Discord both in and out of voice chat
      • baconsunday@lemmy.zip · 22 minutes ago

        Correct! The difference is the OS.

        Windows is a RAM hog, using 4GB or more just to exist. Linux uses 1-2GB, sometimes less.

        Microsoft FORCES Electron web components.

        Linux has choice.

        So yes, Linux has Electron as well, but Linux is a lot lighter and nowhere near the hog that Windows is.

    • The Quuuuuill@slrpnk.net · 10 hours ago

      what’s google got to do with it? this is an article about a product developed at GitHub (now a microsoft subsidiary) causing problems with Windows, and the thumbnail is showing products from the following companies:

      • facebook
      • discord
      • microsoft
      • microsoft
      • microsoft
      • microsoft

      like. look. i hate google. they partner with israel to conduct genocide (don’t use waze, btw, or better yet, don’t use any google products). but this seems like not looking at the whole of how evil all of big tech is just to focus on how evil one company in big tech is

      • rdri@lemmy.world · 2 minutes ago

        The article mentions Chrome/Chromium: 9 times
        The article mentions Google: 0 times

        Google made Chrome. Chrome had that multi-process architecture at its core, which allowed it to consume as much memory as it needed even on a 32-bit OS. Chromium was always inside it, and open source. Then they created CEF, which allowed webdevs to build “real” apps, and that opened the floodgates. Electron was first built on it, but they wanted to include Node and couldn’t, because that required too much experience in actual coding, so they switched to Chromium directly. It didn’t change much structurally; it basically just invited more webdevs to build more “real” apps (at its 1.0 release, Electron advertised hundreds of apps built with it on its website).

        Google could have done something about how the web engine works in frameworks (which don’t need that much actual web functionality), but didn’t. They invited webdevs to do whatever they wanted. Webdevs didn’t care about security, because mighty Google would just publish a new Chromium update eventually. They never realized they don’t need more security in the GUI of a local “real” app that just connects to their website, because there is not much room for security danger in that scenario. They just kept updating the underlying engine, because why not. The Chromium DLL is now at 300 MB or something? All of that code is much needed by everyone, is it not?
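
        To make the pattern concrete - a rough sketch, not from the article, and the URL is just a placeholder - this is roughly all the “native app” code such a wrapper needs, while every app built this way loads its own copy of Chromium and Node into RAM:

        ```typescript
        // Minimal Electron-style wrapper (sketch): the whole "desktop app"
        // is one Chromium window pointed at a website.
        import { app, BrowserWindow } from 'electron';

        app.whenReady().then(() => {
          const win = new BrowserWindow({ width: 1200, height: 800 });
          win.loadURL('https://example.com'); // placeholder URL, not a real product
        });

        // Quit once the single window is closed.
        app.on('window-all-closed', () => app.quit());
        ```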

        So, for me, the sequence has always looked like this:

        Google (caring about webdevs, not the OS) ->

        Webdevs (not caring about native code, and wanting to sell their startup websites by building apps) ->

        Reckless web development becoming the norm for desktop apps ->

        Corporations seeing no problem with any of the above (e.g. Microsoft embedding more stuff via WebView2, aka Chromium)

        So yes, Google has everything to do with it because it provided all the bad instruments to all the wrong people.

        Personally, I don’t care much about hating Microsoft anymore because its products are dead to me and I can only see my future PCs using Linux.

      • Turret3857@infosec.pub · 10 hours ago

        CoMaps is a good alternative to Waze. If you think it isn’t, make an OSM account and help make it one :p

    • zaphod@sopuli.xyz · 14 hours ago

      Electron was originally developed by GitHub for a text editor called Atom.

  • xthexder@l.sw0.com · 1 day ago

    Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage

    Lol, this is news? Where have they been the last 15 years?

    In other news, the sky is blue.

  • UnderpantsWeevil@lemmy.world · 1 day ago

    I remember how the combination of mass distribution of files over the Internet and the blossoming gray market for file-sharing applications really supercharged file-compression technology.

    I wonder if we’ll see skyrocketing RAM prices put similar economic pressure on the system bloat rampant throughout modern OSes.

      • UnderpantsWeevil@lemmy.world · 1 day ago

        I mean, ymmv. The historical flood of cheap memory has changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only using RAM for active calculations. We even used to lean on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, in order to simulate more memory than we had on the stick. SSDs changed that math considerably. We got a bunch of very high-efficiency disk space at a significant markup. But we used the same technology in our RAM. So there was a point at which one might have nearly as much RAM as ROM (I had a friend with 1 GB of RAM on the same device that only had a 2 GB hard drive). The incentives were totally flipped.

        I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.
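
        To make the contrast concrete - a toy sketch, with a hypothetical large “big.log” file standing in for the data - the two habits look roughly like this:

        ```typescript
        // Sketch: two ways to count "ERROR" lines in a large file.
        import { createReadStream, readFileSync } from 'node:fs';
        import { createInterface } from 'node:readline';

        // Old habit: stream from disk, RAM only ever holds the current line.
        async function countErrorsStreaming(path: string): Promise<number> {
          let errors = 0;
          const lines = createInterface({ input: createReadStream(path) });
          for await (const line of lines) {
            if (line.includes('ERROR')) errors++;
          }
          return errors;
        }

        // Cheap-RAM habit: slurp the whole file into memory, then work on it.
        function countErrorsInMemory(path: string): number {
          const whole = readFileSync(path, 'utf8'); // entire file resident in RAM
          return whole.split('\n').filter((line) => line.includes('ERROR')).length;
        }
        ```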

        Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/JavaScript GUI in the browser.

        Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and tablets and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

        But TL;DR: I’d be more inclined to blame “bloat” on web browsers and low-cost memory post-2000s than on AI-written code.

        • nosuchanon@lemmy.world · 16 hours ago

          Now it seems like tech companies are trying to get the entire computer interface to be a dumb terminal to the remote data center. Our migration to phones and pads and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer facing dumb-terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.

          It is definitely coming, and fast. This was always Microsoft’s plan: an internet-only Windows/Office platform. OneDrive and 365 are basically that implementation now that we have widespread high-speed internet.

          And with the number of SaaS apps out there, the only things you need on a local machine are some configuration files and maybe a downloads folder.

          Look at the new Nintendo Switch cartridges as an example. They don’t contain the game, just a license key. The install is all done over the internet.

  • kalpol@lemmy.ca · 1 day ago

    And here I am resurrecting Dell laptops from 2010 with 1.5GB of DDR RAM and Debian.

    • mcv@lemmy.zip · 1 day ago

      I remember when they changed the backronym for Emacs from “Eight Megabytes And Constantly Swapping” to Eighty. Megabytes. Or when a Netscape developer was proud to overtake that memory use.

      What’s the point of more RAM and faster processors if we just make applications that much less efficient?

      • The Quuuuuill@slrpnk.net · 1 day ago

        “unused ram is wasted ram”

        yeah yeah yeah, great. but all you motherfuckers did that and i’m fucking out of ram.

        • pftbest@sh.itjust.works · 15 hours ago

          This phrase is just plain wrong. Unused RAM is used by the kernel for the page cache. You must always have some RAM free, or the page cache has no room and the whole system crawls. A larger page cache means more files from the filesystem can be cached.
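
          You can see it for yourself - a rough sketch, Linux-only since it reads /proc/meminfo:

          ```typescript
          // Sketch: "free" vs "available" RAM differ because the kernel keeps
          // otherwise-idle RAM busy as page cache (which it can reclaim at any time).
          import { readFileSync } from 'node:fs';

          const meminfo = readFileSync('/proc/meminfo', 'utf8');
          const kB = (key: string): number => {
            const m = meminfo.match(new RegExp(`^${key}:\\s+(\\d+) kB`, 'm'));
            return m ? Number(m[1]) : 0;
          };

          console.log('MemFree     ', kB('MemFree'), 'kB');      // truly idle RAM
          console.log('Cached      ', kB('Cached'), 'kB');       // page cache, reclaimable
          console.log('MemAvailable', kB('MemAvailable'), 'kB'); // what apps can actually get
          ```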

        • Korhaka@sopuli.xyz · 1 day ago

          I want to run more than 1 process thanks. So fuck off with you trying to eat 3GB to render a bit of text.

  • BlueBockser@programming.dev · 1 day ago

    Yeah, the RAM shortage is definitely Electron’s fault. Won’t someone please think of the poor AI companies who have to give an arm and a leg for a single stick of RAM!

    • floofloof@lemmy.ca · 16 hours ago

      I wouldn’t mind so much if they were giving their own arms and legs, but they seem to be giving ours.

    • HugeNerd@lemmy.ca · 21 hours ago

      If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.

  • Kissaki@feddit.org · 2 days ago

    I guess the prices give us a new kind of issue ticket template: “new RAM is too expensive for me, please consider optimizing”

    Less abstract and more concrete than “please take less of a share”

    • Kairos@lemmy.today · 1 day ago

      Electron should be a system dependency, so that every single app doesn’t have to be individually updated whenever there’s a Chromium CVE, which seems to be weekly.

      • Kissaki@feddit.org · 6 hours ago

        We kinda have that already.

        Some frameworks/standard libs do support this, making use of the OS’s web-rendering capabilities.

        For example, MAUI WebView:

        WebView uses different browser engines on each platform to render web content:

        • Windows: Uses WebView2, which is based on the Microsoft Edge (Chromium) browser engine. This provides modern web standards support and consistent behavior with the Edge browser.
        • Android: Uses android.webkit.WebView, which is based on the Chromium browser engine. The specific version depends on the Android WebView system component installed on the device.
        • iOS and Mac Catalyst: Uses WKWebView, which is based on the Safari WebKit browser engine. This is the same engine used by the Safari browser on iOS and macOS.

  • floofloof@lemmy.ca · 2 days ago

    If there’s any silver lining to this, perhaps we can get a renewed interest in efficient open-source software designed to work well on older hardware, and less e-waste.

    • DominusOfMegadeus@sh.itjust.works · 2 days ago

      Morgan Freeman: ”They couldn’t”

      I wish we could, but it’s tough to maintain optimism in the face of these sociopathic corporations’ seemingly ever-growing power

      • cenzorrll@piefed.ca · 2 days ago

        Open source developers are just like you and me. They’ll get fed up with the bullshit and start developing things they need with the resources they have, just like they’ve always done.

        • VieuxQueb@lemmy.ca · 1 day ago

          It’s always been that way; why else is there so much great open source software out there? Even Linus started the Linux kernel because he couldn’t afford Unix.

    • XLE@piefed.social · 1 day ago

      “It sounds like you want low-end devices to be turned into thin clients for cloud-based operating systems. Do I have that right?”

    • lostbit@feddit.nl · 21 hours ago

      There are a shit ton of alternatives. Too bad there are even more average developers.

    • Ugurcan@lemmy.world · 2 days ago

      If there’s any silver lining to this: fuck JavaScript, fuck JavaScript wrappers, and fuck everyone who picked JavaScript as the programming language for anything cross-platform.

      It’s unbelievable that I need 6GB of RAM to say a simple “hello” to my friends. It used to take 300KB with IRC.

        • fernandofig@reddthat.com · 1 day ago

          Maybe not JavaScript as a language, but the framework it takes to get applications written in it running, which is a lot. And in a roundabout way, it kinda does have a little to do with the language itself, as the reason Electron got so popular in the first place is that it catered to web developers who either couldn’t be bothered or couldn’t figure out proper desktop app development, so they went with the easy short-term path. And JavaScript kinda is an easy language to pick up and write simple projects in - maintaining more complex applications with it is another story.

          • Nenutzerbame@feddit.org · 1 day ago

            It has less to do with JavaScript than most people tend to think, IMO. JavaScript does not require Electron to exist; it’s rather the other way around. The fact that Electron ships a whole browser is the culprit, and you could even argue that V8 is bloated as well, though I’m not sure how efficiently it’s built or how much space it takes. Browsers historically need to support so much legacy stuff, which is another major factor in their size. I really hope stuff like Tauri or Servo gains traction.

            • [object Object]@lemmy.world · 1 day ago

              you could even argue that V8 is bloated as well

              Not really, no. It’s very compact compared to Python, Java, or most anything in the same league. A compiled program would be smaller, of course, and Lua is minuscule next to anything, but otherwise V8 is small and fast. IIRC, Node.js takes something like 30 MB out of the box, including its modules and libraries.
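
              Easy enough to sanity-check yourself - a quick sketch, and the exact numbers will vary by platform and Node version:

              ```typescript
              // Sketch: print how much memory a bare Node.js process holds at startup.
              const { rss, heapUsed, heapTotal } = process.memoryUsage();
              const mb = (bytes: number) => (bytes / 1024 / 1024).toFixed(1) + ' MB';

              console.log('resident set size:', mb(rss));       // total RAM held by the process
              console.log('V8 heap used:     ', mb(heapUsed));  // live JS objects
              console.log('V8 heap total:    ', mb(heapTotal)); // heap reserved by V8
              ```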

      • [object Object]@lemmy.world · 1 day ago

        Even Electron apps aren’t necessarily RAM hoarders: Stretchly, which is a break reminder and thus needs to run in the background all the time, takes something like 20 or 40 MB of memory.

    • tekato@lemmy.world · 2 days ago

      Why would you do that when you can pull 50 JavaScript libraries and wrap it in Electron?

    • neon_nova@lemmy.dbzer0.com · 2 days ago

      I’d love to see games do this because they are clearly not being optimized. Can’t wait to see that not happen.

      Good thing I’m happy with retro games and the occasional indie.

      • mushroommunk@lemmy.today · 2 days ago

        I’m 3/5 of the way through 100%-ing Final Fantasy II. I figure by the time I catch up to modern Final Fantasy, either hardware will be better again or people will optimize again. Either way, I’ve got time.

          • mushroommunk@lemmy.today · 1 day ago

            I’m doing the Pixel Remasters, which I think are based more on the original JP versions. I know some purists look down on them, but I think overall they’re a solid version.

            • neon_nova@lemmy.dbzer0.com · 17 hours ago

              I haven’t played them myself, but I think they look cool.

              If someone wants to be a purist, let them get an original system and a CRT. Otherwise, they can just shut up about it.

    • warm@kbin.earth · 2 days ago

      Why spend time making better software when the end user can just buy better hardware!

      • floofloof@lemmy.ca · 2 days ago

        That’s been the thinking for the last couple of decades at least. But it can’t continue if people can’t afford new hardware.

        • warm@kbin.earth · 1 day ago

          Hardware doesn’t need to get more powerful either. If we actually harnessed it, we’d have what we need already.