
It used to be like that: computers had limited resources and desktop environments were light. Then at some point RAM became less and less of an issue, and everything started to get bigger and less efficient.

Could anyone summarize why a desktop Windows/macOS now needs so much more RAM than in the past? Is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, etc.?

I believe it's the desktop environment that is greedy, because one can easily run a Linux server on a Raspberry Pi with very limited RAM, but is that really the case?





> Could anyone summarize why a desktop Windows/macOS now needs so much more RAM than in the past

Just a single retina screen buffer, assuming something like 2500 by 2500 pixels at 4 bytes per pixel, is already 25 MB. Then you want double buffering, but also a per-window buffer, since you don't want to force redraws 60x per second and we want to drag windows around while showing their contents, not a wireframe. As you can see, just that adds up quickly. And that's only the draw buffers, not to mention all the different fonts in simultaneous use, the images being shown, etc.

(Of course, screen buffers are typically stored in VRAM once drawn. But you need to draw them first, which happens at least in part on the CPU.)
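
To put rough numbers on it (using the ~2500x2500, 4-bytes-per-pixel figures above, plus a purely hypothetical ten open windows), a quick back-of-envelope sketch:

    // Back-of-envelope framebuffer math; the window count is a made-up example
    const width = 2500, height = 2500, bytesPerPixel = 4;
    const oneBuffer = width * height * bytesPerPixel;   // 25,000,000 bytes, ~25 MB
    const openWindows = 10;                              // hypothetical
    const total = oneBuffer * (2 + openWindows);         // double-buffered screen + one buffer per window
    console.log(`${(total / 1e6).toFixed(0)} MB`);       // prints "300 MB", just for draw buffers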


Per-window double buffering is actively harmful: it means you're effectively triple buffering, since a frame goes window buffer -> composite buffer -> screen. And that's with perfect timing; even this kind of latency is actively unpleasant when typing or moving the mouse.

If you get the timing right, there should be no need for double-buffering individual windows.
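
For a sense of scale, a tiny sketch (assuming a 60 Hz display and one full frame of delay per buffering stage, i.e. the worst case):

    // Worst-case added latency under the assumptions above
    const frameMs = 1000 / 60;                           // ~16.7 ms per refresh at 60 Hz
    const stages = 3;                                    // window buffer -> composite buffer -> screen
    console.log(`${(stages * frameMs).toFixed(1)} ms`);  // ~50 ms, enough to notice while typing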


You don't need to do all of this, though. You could just do arbitrary rendering using GPU compute, and only store a highly-compressed representation on the CPU.

Yes, but then the GPU needs that amount of RAM, so it's fairer to look at the sum of RAM + VRAM requirements. With compressed representations you trade CPU cycles for RAM, and to save laptop battery it's better to use copious amounts of RAM (since it's cheap) than to burn CPU cycles.

The web browser is the biggest RAM hog these days as far as low-end usage goes. The browser UI/chrome itself can take many hundreds of megabytes to render, and that's before even loading any website. It's becoming hard to browse even very "light" sites like Wikipedia on a system with less than 4 GB of RAM.

> Is it the UI animations, color themes, shades, etc., or is it the underlying operating system that has more and more features, services, etc.?

...all of those and more? New software is only optimized until it's not outright annoying to use on current hardware. It's always been like that, which is why there are old jokes like:

    "What Andy giveth, Bill taketh away."

    "Software is like a gas, it expands to consume all available hardware resources."

    "Software gets slower faster than hardware gets faster"
...etc., etc. Variations of those "laws" are as old as computing.

Sometimes the hardware pulls a little bit ahead and there are a few short years of bliss (for instance the ARM Macs), but the software quickly catches up and soon everything feels as slow as always (or worse).

That also means that the easiest way to a slick computing experience is to run old software on new hardware ;)


Indeed. Much of a modern Linux desktop runs inside one of multiple not-very-well-optimized JS engines: GNOME uses JS for various desktop interactions, all major desktops run a different JS engine as a different user to evaluate polkit authorizations (so exactly zero RAM can be shared between those engines, even if they were identical, which they aren't), and half your interactions with GUI tools happen inside browser engines, either directly in a browser or indirectly through Electron. (And typically each Electron app bundles its own slightly different version of Electron, so even if they all run under the same user, each is fully independent.)
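
For reference, those polkit rules really are plain JavaScript files evaluated by polkitd's own JS engine. A minimal, purely illustrative rule (the file name and group are made up) looks like:

    // /etc/polkit-1/rules.d/50-example.rules -- illustrative only
    polkit.addRule(function (action, subject) {
        // let members of "wheel" mount filesystems without a password prompt
        if (action.id === "org.freedesktop.udisks2.filesystem-mount" &&
            subject.isInGroup("wheel")) {
            return polkit.Result.YES;
        }
    });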

Or you can ignore all that nonsense and run openbox and native tools.


A month with CrunchBang Plus Plus (a really nice distribution based on Openbox) and you'll appreciate how quick and well put together Openbox and its text-based config files are.

Which makes it baffling why they chose it. I remember there being memory leaks because GObject uses a reference-counted model: cycles going from GObject into JS and back were impossible to collect.

They did hack around this with heuristics, but they never did solve the issue.

They should've stuck with a scripting language like Lua, which has strong support for embedding.


I've found that Gnome works about as well as other "lighter" desktop environments on some hardware I have that's about 15 years old. I don't think its use of a JS engine impacts performance as much as people claim. Memory usage might be a bit higher, but the main memory hog on a machine these days is your web browser.

I have plenty of complaints about Gnome (not being able to set a solid colour as the background is really dumb IMO), but it seems to work quite well IME.

> Or you can ignore all that nonsense and run openbox and native tools.

I remember mucking about with OpenBox and similar WMs back in the early 2000s and I wouldn't want to go back to using them. I find Gnome tends to expose me to less nonsense.

There is nothing specifically wrong with Wayland either. I am running it on Debian 13 with a triple-monitor setup without issues. Display scaling works properly on Wayland (it doesn't on X11).


COSMIC is gaining ground as a JS-free alternative to current desktops, so hopefully you won't be limited to openbox and such.

Openbox isn't limiting me; Wayland still has no advantages for what I do with desktops.

As written in a sibling comment, maybe RAM being hard to get will bring some of that back.

I really had to save up to buy RAM sticks back in the day.


I am wondering whether, with memory and storage prices skyrocketing, there will be more effort put into making computing use fewer resources.

Unlikely. If you can't afford RAM, how can you afford the SaaS contracts that keep devs employed?

They typically also need GPU acceleration these days, and that can be an even bigger bottleneck, with drivers often not supporting older cards.


