
Indeed, "free memory" is the biggest waste of money and energy imaginable. Unless you lack the memory necessary to do something else, you shouldn't care.

If the question really needs answering, the poster should just get a heap profile. Some instructions are here: http://goog-perftools.sourceforge.net/doc/heap_profiler.html
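If you want to drive it from code instead of the HEAPPROFILE environment variable, here's a minimal sketch using the gperftools API (the binary name and output path are just examples):

    // Sketch: programmatic use of the gperftools heap profiler.
    // Build with: g++ demo.cc -ltcmalloc -o demo
    #include <gperftools/heap-profiler.h>

    #include <vector>

    int main() {
      // Profiles are dumped to files named /tmp/demo.0001.heap, etc.
      HeapProfilerStart("/tmp/demo");

      std::vector<char*> blocks;
      for (int i = 0; i < 100; ++i)
        blocks.push_back(new char[1 << 20]);  // allocate ~100 MB

      HeapProfilerDump("after allocation");   // labeled snapshot
      HeapProfilerStop();

      for (char* b : blocks) delete[] b;
      return 0;
    }

Then inspect a dump with something like: pprof ./demo /tmp/demo.0001.heap --text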

You may also wish to try the memlog flag to Chrome, or rebuild Chrome with TCMALLOC_SMALL_BUT_SLOW defined. Warning: it does what it says.



> Unless you lack the memory necessary to do something else, you shouldn't care.

This is exactly the attitude that leads to such problems. Applications that assume they can use all of the available memory on the machine are hostile to multitasking environments.


Applications should use all available RAM they practically can for things like cache. Ideally they will hold on to it lightly so other applications can take what they need.

RAM that is never used is simply wasted.
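On Linux there is a concrete mechanism for "holding on to it lightly": madvise(MADV_FREE) marks a cache's pages as reclaimable, so the kernel can take them back under memory pressure instead of swapping them out. A minimal sketch (MADV_FREE needs kernel 4.5+; error handling is omitted, and the application must be able to recompute discarded contents, since reclaimed pages read back as zeroes):

    // Sketch: a cache buffer the kernel may reclaim under pressure.
    // Linux-specific; MADV_FREE requires kernel 4.5 or later.
    #include <sys/mman.h>
    #include <cstddef>

    void* AllocLightCache(size_t bytes) {
      void* p = mmap(nullptr, bytes, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
      return (p == MAP_FAILED) ? nullptr : p;
    }

    // Call once the cached contents are reproducible (e.g. re-decodable):
    // the kernel may now discard these pages rather than swap them.
    void MarkReclaimable(void* p, size_t bytes) {
      madvise(p, bytes, MADV_FREE);
    }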


This is built into any operating system. If an application frees up memory, the OS doesn't throw that data out, it just marks it as "can be overwritten" and ensures that it actually can be overwritten without data loss (so, if the data in RAM has been changed, it writes back these changes to the hard drive).

Yes, you can actually manage RAM more efficiently for your application if you just reserve a whole bunch of it and work inside of that, but at best you'll trick your users into thinking your application is fast; it's not actually in the interest of the user. If they wanted to dedicate all RAM to your application, they would not have opened a second application.


The OS should use all available RAM it practically can for things like cache.

Not the applications.


That may not actually be a practical approach to the problem. Years (and jobs) ago, we rolled significant caching into Vegas (video editor), allowing the app to take up 8GB of RAM in cached frames, even on 32-bit builds. There was no way for the OS to do this sort of work for us. The best that the operating system would have been able to do would have been to cache file reads for us.

Caching the output of computation was incredibly valuable to our customers, and the caching that the OS would have done would have had little to no benefit to our customers.
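The usual shape of such a computed-output cache is an LRU keyed by frame number and bounded by a byte budget. A hypothetical sketch for illustration, not actual Vegas code:

    // Sketch: an LRU cache of rendered frames with a fixed byte budget.
    #include <cstdint>
    #include <list>
    #include <unordered_map>
    #include <vector>

    class FrameCache {
     public:
      explicit FrameCache(size_t budget_bytes) : budget_(budget_bytes) {}

      const std::vector<uint8_t>* Get(int64_t frame) {
        auto it = index_.find(frame);
        if (it == index_.end()) return nullptr;
        lru_.splice(lru_.begin(), lru_, it->second);  // mark most recent
        return &it->second->pixels;
      }

      void Put(int64_t frame, std::vector<uint8_t> pixels) {
        if (index_.count(frame)) return;
        used_ += pixels.size();
        lru_.push_front(Entry{frame, std::move(pixels)});
        index_[frame] = lru_.begin();
        while (used_ > budget_ && !lru_.empty()) {  // evict least recent
          used_ -= lru_.back().pixels.size();
          index_.erase(lru_.back().frame);
          lru_.pop_back();
        }
      }

     private:
      struct Entry {
        int64_t frame;
        std::vector<uint8_t> pixels;
      };
      size_t budget_, used_ = 0;
      std::list<Entry> lru_;
      std::unordered_map<int64_t, std::list<Entry>::iterator> index_;
    };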

The same could be said for Chrome. Maybe there is value to a globally coordinated OS caching manager, but the OS shouldn't have sole responsibility for caching. Such an approach is clearly suboptimal for the user in at least one use-case (see example above).


There are exceptions, but in almost all cases, an application hacking the OS's RAM management is disproportionately slowing down other applications in order to make itself seem faster.

Shaming such RAM-hogging applications is not just legitimate, it's very much necessary. Otherwise all applications would just reserve as much RAM as they can convince the OS to give them, even if they have no real use for it. There are no checks for this in place, so users need to understand the technical implications and punish RAM hogs by avoiding them.

If your application can utilize RAM to disproportionately speed itself up and therefore on average speed up the workflow of most users, then that's legitimate, too, and an informed user will see the value.


You're talking about a professional software suite meant to work on workstations often solely meant to run that suite.

Chrome has to cooperatively share resources with other applications, and with aggressive caching at the application level, the only strategy left to the OS is to start swapping.


I'm responding to the assertion: "The OS should use all available RAM it practically can for things like cache. Not the applications."

Applied absolutely, it results in deeply sub-optimal behavior in a large number of cases.

Another case could be image decode caching via Glide for scrolling lists, or Skia output tile caching in a browser, or texture caching in a game engine, or reference frame caching in a video decoder, or glyph caching in a word processor, or block caching in a constructive solid modeller, or result caching in a spreadsheet, or composition caching in a presentation tool, etc. etc. etc.

Caching is an enormously effective tool for applications and operating systems. If an operating system removed it as a tool for applications, it wouldn't likely be a competitive operating system.

Should applications abusively and single-mindedly monopolize memory usage? In some cases, maybe. In general, I think we'd tend to be on the same side of things. I like to see well-behaved applications (that make conservative and efficient use of system resources) and robustly managed operating systems (that don't let apps walk all over them).


IIRC Chrome actively adjusts its memory usage based on available memory.

If you have very little, it will turn off many optimizations, turn on features that reduce memory (like the tab suspension built into the browser), and overall adjust the whole browser to use fewer resources but go slower.
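The mechanism is easy to sketch: read how much memory the system says is available and size your caches from that. A hypothetical Linux-only example; Chrome's real heuristics are more involved than this:

    // Sketch: pick a cache budget from MemAvailable in /proc/meminfo.
    #include <fstream>
    #include <sstream>
    #include <string>

    // Returns MemAvailable in bytes, or 0 on failure.
    size_t AvailableBytes() {
      std::ifstream meminfo("/proc/meminfo");
      std::string line;
      while (std::getline(meminfo, line)) {
        std::istringstream in(line);
        std::string key;
        size_t kib;
        if (in >> key >> kib && key == "MemAvailable:") return kib * 1024;
      }
      return 0;
    }

    size_t CacheBudget() {
      size_t avail = AvailableBytes();
      if (avail < (512u << 20)) return 16u << 20;   // tight: cache little
      if (avail < (2ull << 30)) return 128u << 20;  // modest budget
      return 512u << 20;                            // plenty: cache aggressively
    }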


> Unless you lack the memory necessary to do something else

This is the case. 8 GB of memory, a VM takes half of it, and Chrome will happily take the rest with a few tabs open and lock up my computer.



