• locuester@lemmy.zip · 2 months ago

    NodeJS is worse. One dude just had to write a CLI-based JavaScript runtime, and holy hell, now entire backends run on the least performant runtime possible.

    • stormeuh@lemmy.world · 2 months ago

      Yeah, and all because god forbid you give your (future) employees time to learn another language besides JavaScript. Nope, line must go up so programming must be further commodified.

    • Ricaz@lemmy.dbzer0.com · 2 months ago

      You can bash the JavaScript language all you want, but don’t come for its performance lol. NodeJS was very fast across the board when it came out, and it still beats most scripting languages, even some bigger runtimes when it comes to IO.

      • locuester@lemmy.zip · 2 months ago

        Its performance as a backend server is abysmal compared to standard compiled languages.

        It’s absolutely wasteful to use it.

        • Ricaz@lemmy.dbzer0.com · 2 months ago

          The reality is that most backends don’t use compiled languages, but stuff like PHP, Java and Python.

          Within that category, NodeJS scores very high on performance, concurrency, and especially IO.

          And calling it abysmal compared to compiled languages is not fair, but yes, there are much better alternatives.

  • tempest@lemmy.ca · 2 months ago

    Bunch of people are complaining about Electron in this thread, but I’m happy it exists.

    Without Electron you would get way fewer Linux apps, and often no GUI to go with them.

    The RAM usage is high sometimes, but I have 128 GB, and unused RAM is wasted RAM. I don’t care how much something is using until it starts to swap or gets OOM-killed.

    • dogs0n@sh.itjust.works · edited · 2 months ago

      Most people still only have 16 GB of RAM (like me).

      Electron is a net good, but only for small teams that need to ship fast, or solo devs who already know JS and just want something to work.

      Billion-dollar companies using it instead of paying more for native apps is a horrible use case (that’s mainly where my complaints live).

      At the very least, I hope we move to something that uses the webviews already on our system rather than bundling their own, which would save on resources (but opens the possibility for version mismatches, I guess. I dunno if you can “pin” that sorta stuff to a known-working version… but I guess that’s just how browsers work, so…).

  • WormFood@lemmy.world · 2 months ago

    the problem isn’t electron, the problem is that A) html is the only truly cross-platform UI framework, and B) html (and the web stack in general) has way too many features and is way too complex, because Google’s been bolting features onto it for decades.

  • masterspace@lemmy.ca · edited · 2 months ago

    If Electron didn’t exist, the alternative would be slower-developed, clunkier software that’s buggier and has fewer features.

    There is no magic bullet like ‘just code the exact same thing in C’. There are tradeoffs to every development framework.

    • velindora@lemmy.cafe · 2 months ago

      The happy medium is Tauri or Wails. No (well, less) bloat. People should stop using Electron and Google tech.
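
      Since Tauri uses the webview the OS already ships (WebView2, WebKitGTK, WKWebView), the native side of a do-nothing app is tiny. A minimal sketch, assuming Tauri v1’s Rust API, with the window itself described in tauri.conf.json:

      ```rust
      // main.rs: the entire native shell of a minimal Tauri app.
      // No browser engine is bundled; the window is the system webview.
      fn main() {
          tauri::Builder::default()
              .run(tauri::generate_context!()) // reads window config from tauri.conf.json
              .expect("error while running tauri application");
      }
      ```

      Everything else is ordinary HTML/CSS/JS served into that webview, which is where the size and RAM savings over Electron come from.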

    • mfed1122@discuss.tchncs.de · edited · 2 months ago

      Thank you for saying this. Seeing all this thinking, I have to ask: have people used native apps recently? They’re not as great as people say. Have they tried coding a UI in a native library instead of the holy HTML/CSS/JS trifecta? It’s usually fairly miserable and extremely non-customizable by comparison.

      All this hating on Electron, hating on UE5, etc. really rubs me the wrong way. Firstly because people talk about optimization and “the good old days” while ignoring that we have completely different requirements these days. The new Witcher game isn’t fucking Quake. It’s gonna use some hardware. What do you want people to do? Implement custom rendering engines for every game? That’s the same as saying you want fewer games, because most teams literally cannot do that for various reasons, and the same applies to Electron apps.

      Like, I get it. Things should be optimized. But I feel like “software is unoptimized now” is mostly a meme propagated by tech and gaming YouTubers who don’t really know what they’re talking about, to an audience of wannabes who don’t really know what they’re talking about either. People whining about le yandere dev toothbrush!1!1! and le undertale dialogue if statements!1!1!. E.g., I remember hearing people say that because Borderlands has a cel-shaded effect it should be cheaper to render, which is a completely wrong and backwards statement.

      It’s incredible how gamers think they understand rendering technology just because they play a lot of video games. And similarly, I don’t like when developers (and probably a lot of non-developers) make a lot of assumptions about other people’s apps. See the complaints about Spotify memory usage. We don’t know anything about how Spotify works internally. There could be an algorithm analyzing multiple songs at once to determine what to queue up next, or all sorts of other things. It’s so presumptuous to just look at an app in Task Manager and be like “pathetic, I could do better”, especially if it runs without problems on your device.

      And maybe it is built with Electron? So what? That just means you’re paying some RAM for an always-updated UI that matches what you get everywhere else. Are we just gonna neglect that Electron provides a basically fully homogeneous experience across all platforms with no extra code needed? We’re just gonna act like that’s worth nothing?

      It’s so entitled to say “nooooo, I need you to spend an extra $2M/yr paying a Windows 8 UI dev team so that the Windows 8 native app can have a full ten years of service, and use 80 MB instead of 1 GB of RAM, so that I can use this app and 200 other glorious native apps all simultaneously, but also I don’t want to pay any more for the product, and I don’t care if you’re a solo developer, because back in my day solo developers authored papers about their custom algorithms, and if you don’t do that, but with my new 100x more demanding requirements, you’re trash”.

      • paequ2@lemmy.today · edited · 2 months ago

        Have they tried coding a UI in a native library instead of the holy HTML/CSS/JS trifecta? It’s usually fairly miserable and extremely non-customizable by comparison.

        🙋‍♂️ I have. Exactly because Electron = bloat. Granted, it was just a small side project that I spent like a month or so building. I wanted to learn GTK4, Adwaita, GNOME Blueprints, and Vala.

        I personally didn’t think it was too miserable (again, small project, not a ton of specialized needs). However, I 10000% agree with the “extremely non-customizable by comparison” part. I can totally see why companies don’t want to look like a generic OS app. Getting the Bitwarden app to look like Bitwarden on Linux seems like it would be waaay harder and more time-consuming than just reusing their existing HTML, CSS, and JS codebase. At least in my month of messing with GTK, it seems like desktop UIs have wwwwaaaaayyyyyyy less control over the UI than webapps do, at least by default. I’m guessing you can write more Vala to get a more custom UI in GTK, but again, that seems like waaaaayy more work for something highly custom.
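
        (For a sense of what the starting point looks like: a do-nothing GTK4 window is only a handful of lines in any of the bindings. A sketch in the gtk4-rs Rust bindings rather than Vala, assuming the crate’s builder-style API; the app id is a hypothetical placeholder:)

        ```rust
        // Minimal GTK4 window: the native-app "hello world".
        use gtk4::prelude::*;
        use gtk4::{Application, ApplicationWindow};

        fn main() -> gtk4::glib::ExitCode {
            let app = Application::builder()
                .application_id("com.example.Demo") // hypothetical app id
                .build();

            app.connect_activate(|app| {
                // A stock window; pushing past the stock widget look is where
                // the "non-customizable" pain starts.
                let window = ApplicationWindow::builder()
                    .application(app)
                    .title("Demo")
                    .default_width(400)
                    .default_height(300)
                    .build();
                window.present();
            });

            app.run()
        }
        ```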

        By the end, I thought: Electron = bloat, but also Electron = apps existing at all.

    • bluGill@fedia.io · 2 months ago

      until curl is rewritten in Electron and you don’t have enough RAM to run it anymore

    • alk@lemmy.blahaj.zone · 2 months ago

      back in the day people would download more RAM and put it on giant tape-based backup systems. Big companies started downloading massive amounts of high-quality RAM this way. This created a RAM shortage, and companies like Corsair are now using their massive reserves of downloaded RAM, filling empty RAM sticks with it, and making lots of money. That’s why RAM is so expensive today. Any RAM you can download today is low-quality RAM, and the only high-quality RAM can be had on physical sticks, which were filled by the companies with RAM reserves. 1969 was the peak of the RAM harvesting, so you’ll probably get some really great RAM if it came from that year.

  • tangonov@lemmy.ca · 2 months ago

    Meanwhile my Linux runtime still boots in under 1 GB, and Emacs is looking pretty good right now lol

  • Digit@lemmy.wtf · 2 months ago

    LOL.

    An enjoyable little irony for me personally: when I asked an LLM to churn out a README for my fin project (posted on Lemmy here), it proposed, as one of fin’s advantages that could push it into the notable category: “Performance niche - might be faster than Electron-based editors for simple tasks”. Maybe I should change that to “save a fortune on RAM compared to Electron-based editors”.

    • Hudell@lemmy.dbzer0.com · 2 months ago

      Same, though I’ve started having some issues with their slower updates not catching up to changes in OSes and stuff (using it on an atomic distro, for example, is quite a pain).

  • boonhet@sopuli.xyz · 2 months ago

    Atom was kinda revolutionary in its plugin support and everything IIRC.

    Well, now that Atom has been replaced by VSCode, which is also an electron app, the original Atom devs, or at least some of them, are creating Zed. Zed’s written in Rust and uses a lot less memory.

    Of course, it’s not yet as mature, and they’re trying to earn money by integrating AI and selling that as a service. BUT the AI is optional, and even if you do want to use it, you don’t have to pay for theirs (which has a free tier anyway): you can literally run your own model in Ollama.

    It’s not perfect, but I love how little RAM it uses compared to VSCode and (shudders) the JetBrains suite (which I normally love, but whose RAM and CPU usage can drive my computer pretty slow).

    • foo@feddit.uk · 2 months ago

      They also developed their own Rust UI library, GPUI, and open-sourced it.

    • dreadbeef@lemmy.dbzer0.com · 2 months ago

      I still have the patch they sent to people who published packages. I made a theme no one but me used, but still! Pre-Microsoft GitHub was cool.

      • Calyhre@lemmy.world · 2 months ago

        Got that patch still in its brown envelope somewhere in a drawer, for doing a syntax highlighting plugin.

        They were indeed cool

    • PoliteDudeInTheMood@lemmy.ca · 2 months ago

      That explains a lot. I have both PyCharm and RustRover open as I steal, er, convert stuff from a project I found. Anywho, I was typing in Discord faster than it could render, and I thought that was strange.

      • The Quuuuuill@slrpnk.net · 2 months ago

        it did, but this is about electron, which isn’t relevant to sublime. sublime’s plugin mechanism is a little different from atom’s, which is much more like emacs

        • jol@discuss.tchncs.de · 2 months ago

          Yes, but the plugin ecosystem really was pioneered by Sublime and then ported over everywhere. A big reason Atom was so successful is that Sublime plugins and themes were compatible with it.

    • NickeeCoco@piefed.social · 2 months ago

      It has become my favorite editor, even though I don’t need or want the AI stuff. They do something that I quite appreciate, and that I wish other apps (looking at you, Firefox) would do:

      [screenshot of Zed’s AI settings]

      In the AI section of the settings, the first thing is a toggle that turns off all AI features.

      • x00z@lemmy.world · 2 months ago

        It shouldn’t have AI features by default though. Just make that functionality a plugin that can be downloaded separately.

    • MeThisGuy@feddit.nl · 2 months ago

      i doo doo love it too.
      does it have syntax support for G-code yet? I do CNC (not the kinky kind) and I love to see shit in color. there are only a few specialized editors I’ve come across that do this reasonably well…

      • Buddahriffic@lemmy.world · 2 months ago

        IIRC you can create custom syntax-highlighting formats for Notepad++. So if it’s not there by default, someone else might have made a file for it, or you can start making one yourself, as the format was easy to understand. It’s been like a decade since I’ve used it, but it should be somewhere in the menus.

  • zorro@lemmy.world · 2 months ago

    rust-analyzer takes 16 GB of RAM though, so good luck actually working on a Rust project

    (Semi kidding, the project I work on is very big)

    • finitebanjo@lemmy.world · 2 months ago

      Honestly, I can’t imagine a project getting to that size unless there are no placeholder assets for visuals, or it uses a buttload of local libraries.

  • 1984@lemmy.today · 2 months ago

    Linux wins again. Still runs on the same hardware as 10 years ago. :) No forced updates by any big corp.

    • gerryflap@feddit.nl · 2 months ago

      In terms of performance, yeah. Though not every old device keeps working; you’re still at the mercy of driver support in newer kernels. My old ThinkPad no longer functions properly because the Nvidia drivers are not compatible with newer kernels. I can either have an unsafe machine that runs fine or an up-to-date machine that can barely open a web browser.

        • Digit@lemmy.wtf · 2 months ago

          I got lucky. Never had any issues with Nvidia on Linux in all my 2+ decades of using Linux.

          Still prefer AMD though. Straight through.

  • Ex Nummis@lemmy.world · 2 months ago

    If there’s any upside to the entire situation, it’s that perhaps, maybe, developers will again start paying more attention to optimization instead of just throwing more powerful hardware at the problem.

    Some of the greatest games ever developed for consoles were great because the developers had to get extremely creative with the limited resources at their disposal. This led to some incredibly optimized games that could do a whole lot with those very limited resources.

    • masterspace@lemmy.ca · edited · 2 months ago

      The upside to the situation is that Electron has been a more successful cross-platform development framework than literally anything that came before it, from Xamarin to Java. And it’s entirely based on open-source software and open web standards.

    • hornywarthogfart@sh.itjust.works · edited · 2 months ago

      I think pretty much every dev understands the issue, but they are limited in what they can do about it. Quitting a job because they won’t let you optimize is noble, but unrealistic for the vast majority of devs.

      I would love for optimizations to start being prioritized. More specifically, I would love to see vendors place limits on resource use in apps. For example, Steam could reject any game over 50 GB. I do not believe for a moment that any game we currently have needs more than 50 GB, except maybe an MMO with 20 years of content. Or Microsoft could reject apps that use more than X amount of RAM. They won’t ever do that, but without an outright rejection, this won’t be fixed.

    • ulterno@programming.dev · 2 months ago

      I always care about how much memory I end up using.
      Problem is, most places won’t pay for caring about that. Those that would are doing so because they run the product on their own systems instead of some customer’s systems.

      • ZILtoid1991@lemmy.world (OP) · 2 months ago

        I think we will first see a batch of alternative apps, which will either get shut down by manufacturers etc., or get tolerated as alternatives.

        • ulterno@programming.dev · 2 months ago

          I’m not sure I know many Electron apps that are worth running.
          There is WhatsApp, but I just run the browser version. For Matrix, there’s NeoChat, which uses QML and is definitely better than Electron.

            • ulterno@programming.dev · 2 months ago

              android-studio: I guess that explains why it ran so badly back when I had to use it for work. The jdk wouldn’t be an Electron app, right?

              Discord is the only one of those that I used in any meaningful sense before, and I already stopped using it for reasons other than Electron. So I guess it’s just a personal thing that I don’t tend to require stuff that is made in Electron.

              • BootLoop@sh.itjust.works · 2 months ago

                I believe Android Studio is built on top of IntelliJ IDEA, which uses Java, so no Electron. That being said, Java applications are generally RAM-heavy as well, and Android Studio was always a pig on resources.

                Visual Studio Code (not Visual Studio!) is Electron based but I’ve always had good performance with it.

                • ulterno@programming.dev · 2 months ago

                  Visual Studio Code

                  Yeah, that’s one that I can’t talk badly about.
                  While I have used MS Visual Studio and know how slow it was, I tried VS Codium once or twice and it worked pretty smoothly. Someone probably put quite a bit of effort into making it so.

                  Apart from Android Studio, which ended up not even starting up properly on the work computer, Gradle itself also takes quite a bit of time and resources. I was using the NDK with a C++ project and it took way longer to setup than any BSP, despite only being able to compile for a single version of Android.

    • Jesus_666@lemmy.world · 2 months ago

      Best I can do is mandatory Lumen and Nanite. You can get almost-stable 60 fps on a 5090 with DLSS Performance and 3x frame gen, which should be optimized enough for anyone.

      My game will sell for 80 bucks, 150 if you want the edition with all the preorder-exclusive content.

    • BootLoop@sh.itjust.works · 2 months ago

      You don’t even need to go that far back. It blows my mind that the 360 and PS3 had 512 MB of RAM. Halo 4, GTA 5, and The Last of Us did some impressive graphics work with 512 MB.

      • dogs0n@sh.itjust.works · edited · 2 months ago

        Oh wow, my mind is blown. Even more so that the PS3 had 256 MB of system RAM and 256 MB of VRAM, separately.

        We have really gone downhill, and fast ;(

        In my brain’s memory, I find it hard to believe all the textures loaded at one time could ever fit in so little. I’m amazed.

      • I Cast Fist@programming.dev · 2 months ago

        tbf, the PC versions of console games of the time ran like utter shit on computers with less than 2 GB of RAM and graphics cards worse than a GeForce 9800. A lot of people were still on WinXP, which was bloated compared to WinME/2000, but by 2006 it was fine.

    • Digit@lemmy.wtf · 2 months ago

      If there’s any upside to the entire situation, it’s that perhaps, maybe, developers will again start paying more attention to optimization instead of just throwing more powerful hardware at the problem.

      Amen.

      It has long irked me that so many developers fail at the mathematics of the situation.

      If hardware multiplies its resources 1000x, that does not mean you can make your program use 1000x the resources, especially not with thousands of other developers failing at that mathematics too, making bloat radically outpace Moore’s Law.

      If hardware multiplies its resources 1000x, that should mean developers keep their software tight, lean, and fast, so that users have 1000x more resources available to do more with.

      *Dreamer*