@ot-666: Could you please check if, after following Thorsten's recommendations, the memory consumption improves to reasonable levels - i.e. when using a similar visibility/terrain range?
Also, is this in any way affected by enabling or disabling the local weather system?
Thorsten wrote in Thu Aug 23, 2012 7:00 am: It also seems clear that this ability to fill up memory causes problems at some point - I'm just not sure how to communicate this to users.
Just a couple of days ago, Stuart and I talked about exactly that here:
viewtopic.php?f=5&t=16083&p=164936#p164833
As you can see, the idea is to get real-time stats from the OS on the amount of free RAM and the amount of swap space used by the fgfs process (easy on Linux), and to write these to the property tree at 1-5 second intervals, so that Nasal scripts or property rules can use this info to show warnings/errors, or even to dynamically scale down some subsystems (random buildings, local weather, tile cache etc.).
Once this info is in the property tree, it could also be shown in the rendering dialog - i.e. "Amount of free RAM" vs. "Swap space already used".
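To illustrate the "easy for Linux" part: on Linux, both numbers can be read from procfs without any extra dependencies. The sketch below is just a Python illustration of the idea (the actual implementation would be C++ code writing to the property tree); the function names are made up here.

```python
import re

def parse_kv_kb(text):
    # Parse "Key:   12345 kB" lines, the format used by both
    # /proc/meminfo and /proc/self/status, into a dict of kB values.
    stats = {}
    for line in text.splitlines():
        m = re.match(r"(\w+):\s+(\d+)\s*kB", line)
        if m:
            stats[m.group(1)] = int(m.group(2))
    return stats

def read_memory_stats():
    # Linux-only: return (free RAM in kB, swap used by this process in kB).
    # A timer would call this every 1-5 seconds and publish the values.
    with open("/proc/meminfo") as f:
        meminfo = parse_kv_kb(f.read())
    with open("/proc/self/status") as f:
        status = parse_kv_kb(f.read())
    return meminfo.get("MemFree", 0), status.get("VmSwap", 0)
```

Once such values are published, a Nasal listener or property rule can compare them against thresholds and raise the warnings discussed above.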
In addition, Stuart worked out a way to estimate the memory consumption of the random buildings, so that this could also be linked to a slider - i.e. not just a "density" setting (0..1), but rather an "MB per density" control, for example stepping in 256 MB increments, so that people can directly see how much RAM the system is going to use.
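The slider mapping itself is trivial once Stuart's cost estimate exists. A hypothetical sketch (the per-density cost figure is an assumption, not Stuart's actual estimate):

```python
def density_for_budget(budget_mb, mb_at_full_density, step_mb=256):
    # Snap the user's RAM budget down to step_mb increments, then
    # convert it to a 0..1 density value given the estimated cost
    # of the random buildings at full density.
    snapped = (budget_mb // step_mb) * step_mb
    return min(1.0, snapped / mb_at_full_density)
```

So a user who grants 512 MB against an estimated 1024 MB full-density cost would get a density of 0.5, and the dialog can display the snapped MB figure directly.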
We also talked about introducing a simple form of "hard quotas" to disable the system and prevent it from allocating new objects once a certain limit is reached.
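The "hard quota" idea amounts to a single check before each allocation. A minimal sketch, assuming the subsystem can estimate the size of each object it creates (class and method names are hypothetical):

```python
class MemoryQuota:
    # Hard quota: once the estimated total crosses the limit, the
    # subsystem simply stops creating new objects instead of
    # pushing the process into swap (all figures in MB).
    def __init__(self, limit_mb):
        self.limit_mb = limit_mb
        self.used_mb = 0

    def try_allocate(self, size_mb):
        if self.used_mb + size_mb > self.limit_mb:
            return False  # quota reached - caller skips this object
        self.used_mb += size_mb
        return True
```

A subsystem like the tile cache would call try_allocate() before loading a new tile and quietly skip the load when it returns False.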
ot-666 wrote: This is no problem with a 64bit fgfs, but a lot of people still use a 32bit version.
Yes - what is really needed is a good way to track memory consumption per subsystem, analogous to how the performance monitor works, so that we can see exactly where memory is allocated and how many allocations happen per frame/second/minute, and figure out where the real culprit actually is.
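In the spirit of the performance monitor, per-subsystem tracking could be as simple as a pair of counters keyed by subsystem name. A hypothetical sketch (not FlightGear code - the real thing would hook into the C++ allocation paths):

```python
from collections import defaultdict

class AllocationTracker:
    # Per-subsystem byte and allocation-count totals; a periodic
    # report sorted by bytes makes the worst offender obvious.
    def __init__(self):
        self.bytes = defaultdict(int)
        self.counts = defaultdict(int)

    def record(self, subsystem, nbytes):
        self.bytes[subsystem] += nbytes
        self.counts[subsystem] += 1

    def report(self):
        # Heaviest consumer first.
        return sorted(self.bytes.items(), key=lambda kv: -kv[1])
```

Dividing each subsystem's count by the elapsed frames or seconds then gives exactly the allocations-per-frame/second/minute rates mentioned above.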