- Windows Latest found Discord and other Chromium- and Electron-based applications using unusually large amounts of RAM
- Discord’s RAM usage spikes from 1 GB to 4 GB, both in and out of voice chat
Just for reference: my current CPU (5700X3D) has more cache than my Windows 98 computer had RAM. And Win98 wasn’t that bad.
Way ahead of you Luddites
https://iicode.riskfine.rest/index.php?main_page=product_info&products_id=993496
Windows Latest discovered Discord and other Chromium and Electron-based applications with high RAM usage
Lol, this is news? Where have they been the last 15 years?
In other news, the sky is blue.
I remember how the combination of Internet mass distribution of file data and the blossoming gray market for file-share applications really super-charged the technology of file compression.
I wonder if we’ll see skyrocketing RAM prices put economic pressure on the system bloat rampant through modern OSes.
Isn’t the bloat basically being coded by the same ai that’s eating up the ram to begin with?
I mean, YMMV. The historical flood of cheap memory changed developer practices. We used to code around keeping the bulk of our data on the hard drive and only using RAM for active calculations. We even leaned on “virtual memory” on the disk, caching calculations and scrubbing them over and over again, in order to simulate more memory than we had on the stick. SSDs changed that math considerably. We got a bunch of very high-efficiency disk space at a significant markup, built on the same solid-state technology as our RAM. So there was a point at which one might have nearly as much RAM as storage (I had a friend with 1 GB of RAM on a device that only had a 2 GB hard drive). The incentives were totally flipped.
I would argue that the low-cost, high-efficiency RAM induced the system bloat, as applications could run very quickly even on a fraction of available system memory. Meanwhile, applications that were RAM hogs appeared to run very quickly compared to applications that needed to constantly read off the disk.
Internet applications added to the incentive to bloat RAM, as you could cram an entire application onto a website and just let it live in memory until the user closed the browser. Cloud storage played the same trick. Developers were increasingly inclined to ignore the disk entirely. Why bother? Everything was hosted on a remote server, lots of the data was pre-processed on the business side, and then you were just serving the results to an HTML/Javascript GUI on the browser.
Now it seems like tech companies are trying to turn the entire computer interface into a dumb terminal for the remote data center. Our migration to phones and tablets and away from laptops and desktops illustrates as much. I wouldn’t be surprised if someone finally makes consumer-facing dumb terminals a thing again - something we haven’t really experienced since the dawn of personal computers in the 1980s.
But TL;DR: I’d be more inclined to blame “bloat” on web browsers and cheap memory post-’00s than on AI-written code.
Traditions… Simple: download more ram
And here I am, resurrecting Dell laptops from 2010 with 1.5 GB of DDR RAM and Debian.
I remember when they changed the backronym for Emacs from “Eight Megabytes And Constantly Swapping” to “Eighty Megabytes.” Or when a Netscape developer was proud of overtaking that memory usage.
What’s the point of more RAM and faster processors if we just make applications that much less efficient?
“unused ram is wasted ram”
yeah yeah yeah, great. but all you motherfuckers did that and i’m fucking out of ram.
I want to run more than 1 process thanks. So fuck off with you trying to eat 3GB to render a bit of text.
deleted by creator
Yeah, the RAM shortage is definitely to blame on Electron. Won’t someone please think of the poor AI companies who have to give an arm and a leg to get a single stick of RAM!
If you have a better way of generating videos of absurdly obese Olympic divers doing the bomb from a crane, I’d love to hear it.
Tbf isn’t AI mainly used to code electron apps by shitty companies?
I have a couple of old 8 GB sticks from my old PC with a 960 GPU. Is there any way for me to stick them into my new PC and have only a certain app use them and nothing else?
Only for multi-CPU mobos (and that would be pinning a thread to a CPU/core with NUMA enabled, so a task accesses local RAM instead of all system RAM). Even then, I think all the RAM would run at the lowest common frequency.
I’ve never mixed CPUs and RAM speeds; I’ve only ever worked on systems with matching CPUs and RAM modules. I think the hardware cost and software complexity to achieve this outweigh the cost of “more RAM” or “faster storage (for faster swap)”.
As to whether it’s possible to get certain apps to use specific physical RAM sticks: I’m not sure, but that seems unlikely and would probably require some very low-level modifications to your operating system. And even before you get to that point, you’d have to physically fit them in your new motherboard, which only works if there are free RAM slots and the board takes the same generation of RAM your old PC uses.
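For what it’s worth, the closest real-world mechanism is the NUMA pinning mentioned a couple of comments up: you can steer a process onto a particular NUMA node and its local RAM, not onto a particular physical stick. Here’s a minimal sketch with libnuma, assuming a multi-socket/NUMA-enabled machine and that node 0 is the one you care about (both assumptions on my part, and none of this helps with mismatched sticks from an old PC):

```c
/* numa_pin.c - hypothetical sketch of the NUMA pinning described above:
 * run on node 0 and allocate the working buffer from node-local RAM.
 * Assumes a multi-socket / NUMA-enabled machine with libnuma installed.
 * This binds memory to a NUMA node, not to specific physical DIMMs.
 *
 * Build: gcc numa_pin.c -o numa_pin -lnuma
 */
#include <numa.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "No NUMA support on this system\n");
        return EXIT_FAILURE;
    }

    int node = 0;                      /* hypothetical choice: first node */
    numa_run_on_node(node);            /* restrict this thread to that node's CPUs */

    size_t len = 64UL * 1024 * 1024;   /* 64 MiB working buffer */
    char *buf = numa_alloc_onnode(len, node);   /* node-local allocation */
    if (!buf) {
        perror("numa_alloc_onnode");
        return EXIT_FAILURE;
    }

    memset(buf, 0, len);               /* touch the pages so they actually get placed */
    printf("Buffer backed by RAM on NUMA node %d\n", node);

    numa_free(buf, len);
    return EXIT_SUCCESS;
}
```

If you just want to try it with an existing app, `numactl --cpunodebind=0 --membind=0 <app>` does roughly the same thing from the command line, again only on a machine that actually has multiple NUMA nodes.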
Doubt it
And Apollo launched with 4 KB of RAM.
4 KB of RAM and an office packed with hundreds of engineers using slide rules, sure.
They used Cloud Computation and AI (“Actually Interns”) way before it was cool.
I guess the prices give us a new kind of issue ticket template: “new RAM is too expensive for me, please consider optimizing.”
Less abstract, more concrete than “take less of a share please”
Electron should be a system dependency entirely, so that every single app doesn’t have to be individually updated whenever there’s a Chromium CVE, which seems to happen weekly.
Limitation breeds innovation
Just another AI agent bro, that will fix th
Out of Memory or System Resources. Close some windows or programs and try again.
Like a Rust-based alternative to VSCode
No thanks. Any software that has AI integration as one of its main selling points is shitware imo.
VSCode’s history is so messed up. Microsoft buys GitHub and stops development of the GitHub team’s IDE, then uses the framework developed for that IDE to make VSCode.
Fucking 1600s colonizer behavior.
The proliferation of Electron programs is what happens when you have a decade of annoying idiots saying “unused memory is wasted memory,” hand-in-hand with lazy developers and unscrupulous managers externalizing their development costs onto everybody else by writing inefficient programs that waste more and more of our compute and RAM, which forces the rest of us to buy ever-better hardware just to keep up.
annoying idiots saying “unused memory is wasted memory,”
The original intent of this saying was different, but ya it’s been co-opted into something else
I wouldn’t mind them all using HTML for UI if they’d learn to share the same one, and only load it when they need to show me something.
No, Razer, your “mouse driver” does not need to load Chrome at all times, when I’ll only ever look at it once.
It’s funny; on Linux such devices work perfectly but many users complain that they “aren’t supported” because there’s no UI (that sits uselessly in your notification area and eats memory).
I really wish Electron wasn’t as popular as it is. It’s such a fucking memory hog. I mean, sure, I’ve got RAM to spare, but I shouldn’t need that much for a single app.
Yes, it runs a separate browser instance for each Electron program. Many of the programs that use it could just be a PWA instead.
This is what bothers me so much… Browsers should be improving their PWA implementation (looking at you, Firefox), and Electron apps should be PWAs more often. Another decent middle ground is Tauri. SilverBullet and Yaak are both so much lighter and better than anything else on my system.
Yeah but companies want full control and no ad blockers. That’s why they’re pushing shoddy Electron apps over their web experiences and PWAs.
I wonder how much exact duplication each process has?
https://www.kernel.org/doc/html/latest/admin-guide/mm/ksm.html
Kernel Samepage Merging
KSM is a memory-saving de-duplication feature, enabled by CONFIG_KSM=y, added to the Linux kernel in 2.6.32. See mm/ksm.c for its implementation, and http://lwn.net/Articles/306704/ and https://lwn.net/Articles/330589/
KSM was originally developed for use with KVM (where it was known as Kernel Shared Memory), to fit more virtual machines into physical memory, by sharing the data common between them. But it can be useful to any application which generates many instances of the same data.
The KSM daemon ksmd periodically scans those areas of user memory which have been registered with it, looking for pages of identical content which can be replaced by a single write-protected page (which is automatically copied if a process later wants to update its content). The amount of pages that KSM daemon scans in a single pass and the time between the passes are configured using sysfs interface
KSM only operates on those areas of address space which an application has advised to be likely candidates for merging, by using the madvise(2) system call:
int madvise(addr, length, MADV_MERGEABLE)

One imagines that one could maybe make a library interposer to induce use of that.
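Purely as a sketch of that interposer idea (an assumption on my part, not something I’ve tested against Electron): an LD_PRELOAD shim that wraps mmap and marks anonymous mappings MADV_MERGEABLE, so ksmd will consider them. Assumes Linux with CONFIG_KSM=y and ksmd switched on via /sys/kernel/mm/ksm/run.

```c
/* mergeable_shim.c - hypothetical LD_PRELOAD sketch of the "library interposer"
 * idea above: mark anonymous mappings MADV_MERGEABLE so ksmd will scan them.
 * Assumes Linux with CONFIG_KSM=y and ksmd enabled
 * (echo 1 > /sys/kernel/mm/ksm/run). Untested; doesn't cover mmap64 etc.
 *
 * Build: gcc -shared -fPIC -o libmergeable.so mergeable_shim.c -ldl
 * Use:   LD_PRELOAD=./libmergeable.so some-electron-app
 */
#define _GNU_SOURCE
#include <dlfcn.h>
#include <sys/mman.h>
#include <sys/types.h>

typedef void *(*mmap_fn)(void *, size_t, int, int, int, off_t);
static mmap_fn real_mmap;

void *mmap(void *addr, size_t length, int prot, int flags, int fd, off_t offset)
{
    if (!real_mmap)
        real_mmap = (mmap_fn)dlsym(RTLD_NEXT, "mmap");   /* find libc's mmap */

    void *p = real_mmap(addr, length, prot, flags, fd, offset);

    /* Only advise anonymous mappings that succeeded; KSM only merges
     * anonymous pages, so file-backed mappings are skipped. */
    if (p != MAP_FAILED && (flags & MAP_ANONYMOUS))
        madvise(p, length, MADV_MERGEABLE);

    return p;
}
```

Whether ksmd would actually find much to merge across separate Electron apps is another question - different Chromium versions mean different page contents, as pointed out below.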
I guess the key is that it has to be the same version of Electron on the back end. If they change too much of it, how much memory can really be shared?
I tried the PWA route with Discord. It wouldn’t stay logged in, and acted generally janky. That said, I do PWA with any app that’s Electron, at least to try and avoid the RAM bloat.
maybe a toggle to choose between “take some extra RAM, I’m feeling generous” and “fuck you, I’m computing shit over here” could be used to let the app know your current mood / needs …
Memory hogging browsers usually do release memory when pressured. You can take it further by getting extensions that unload unused tabs.
The problem is electron apps that load the whole browser core over and over.
If there’s any silver lining to this, perhaps we can get a renewed interest in efficient open-source software designed to work well on older hardware, and less e-waste.
There are a shit ton of alternatives. Too bad there are more average developers.
“It sounds like you want low-end devices to be turned into thin clients for cloud-based operating systems. Do I have that right?”
Morgan Freeman: ”They couldn’t”
I wish we could, but it’s tough to maintain optimism in the face of these sociopathic corporations’ seemingly ever-growing power
Open source developers are just like you and me. They’ll get fed up with the bullshit and start developing things they need with the resources they have, just like they’ve always done.
It’s always been there. Why is there so much great open source software out there? Even Linus started the Linux kernel because he couldn’t afford Unix.
If there’s any silver lining to this: fuck JavaScript, fuck JavaScript wrappers, and fuck all the people who picked JavaScript as the programming language for anything cross-platform.
It’s unbelievable that I would need 6 GB of RAM to say a simple “hello” to my friends. It used to take 300 KB with IRC.
Even Electron apps aren’t necessarily ram hoarders: Stretchly, which is a break reminder and thus needs to always run in the background, takes something like 20 or 40 MB of memory.
that has very little to do with JavaScript though 🤷♂️
Maybe not JavaScript as a language, but the framework it requires to get applications written in it running, which is a lot. And in a roundabout way it does have a little to do with the language itself: the reason Electron got so popular in the first place is that it catered to web developers who either couldn’t be bothered or couldn’t figure out proper desktop app development, so they went with the easy short-term path. And JavaScript is an easy language to pick up and write simple projects in - maintaining more complex applications with it is another story.
It has less to do with JavaScript than most people tend to think, IMO. JavaScript does not require Electron to exist; it’s rather the other way around. The fact that Electron ships a whole browser is the culprit, and you could even argue that V8 is bloated as well, though I’m not sure how efficiently it is built or how much space it takes. Browsers historically need to support so much legacy stuff, which is another major factor in their size. I really hope stuff like Tauri or Servo gains traction.
you could even argue that V8 is bloated as well
Not really, no. It’s very compact compared to Python, Java or most anything in the same league. A compiled program would be smaller, of course, and Lua is minuscule next to anything — but otherwise V8 is small and fast. Iirc Node.js takes something like 30 MB out of the box, including its modules and libraries.
Why would you do that when you can pull 50 JavaScript libraries and wrap it in Electron?
That’ll be 800€ and all the change you own.
I’d love to see games do this because they are clearly not being optimized. Can’t wait to see that not happen.
Good thing, I’m happy with retro games and the occasional indie.
3/5 of the way through 100%-ing Final Fantasy II. I figure by the time I catch up to modern Final Fantasy, either hardware will be better again or people will optimize again. Either way, I’ve got time.
US 2 or JP 2?
US 2 is so good, but the late game seemed to have a difficulty spike, so I never finished it.
I’m doing the pixel remasters which I think are based more on original JP. I know some purists look down on them but I think overall they’re a solid version.
Why spend time making better software when the end user can just buy better hardware!
That’s been the thinking for the last couple of decades at least. But it can’t continue if people can’t afford new hardware.
Hardware doesn’t need to get more powerful either. If we actually harnessed it, we already have what we need.