by mh » Mon Aug 17, 2009 8:42 am
A few other things you can do are to keep the number of pools small and to allocate to each one in batches. I have a 256 K batch size hard-coded in here ("int newsize = MemoryRoundSize (vp->lowmark + size, 256);"), but with a smaller number of memory pools that could easily be increased to 1 MB. What this means is that instead of committing memory on a regular basis, it's only committed in discrete 1 MB chunks, which works out at about 7 or 8 commits per map (as opposed to potentially several thousand if you just commit memory as it's required). A smaller number of pools also means less batch-size overhead; with 6 pools and a 256 K batch, the most "wasted" memory you'll ever have is 1.25 MB.
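For illustration, here's a minimal sketch of what that batched-commit allocation can look like, assuming a Win32 VirtualAlloc-backed pool. The mempool_t layout and the Pool_Alloc name are placeholders of mine rather than the actual code, but the MemoryRoundSize line is the same one quoted above:

[code]
#include <windows.h>
#include <stddef.h>

typedef struct mempool_s
{
    unsigned char *base;   // start of the reserved virtual address region
    int maxsize;           // total reservation (the pool's upper limit)
    int lowmark;           // bytes handed out so far
    int highmark;          // bytes actually committed so far
} mempool_t;

// round size up to the next multiple of batchkb kilobytes
static int MemoryRoundSize (int size, int batchkb)
{
    int batch = batchkb * 1024;
    return ((size + batch - 1) / batch) * batch;
}

void *Pool_Alloc (mempool_t *vp, int size)
{
    void *data;

    size = (size + 7) & ~7;   // keep individual allocations 8-byte aligned

    if (vp->lowmark + size > vp->highmark)
    {
        // commit a whole 256 K batch rather than just the bytes requested
        int newsize = MemoryRoundSize (vp->lowmark + size, 256);

        if (newsize > vp->maxsize) return NULL;   // pool exhausted

        VirtualAlloc (vp->base + vp->highmark, newsize - vp->highmark, MEM_COMMIT, PAGE_READWRITE);
        vp->highmark = newsize;
    }

    data = vp->base + vp->lowmark;
    vp->lowmark += size;

    return data;
}
[/code]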
My current incarnation uses only 6 pools - permanent, this game, cached models and sounds, this map, edicts and temp. Cached objects could easily be amalgamated with "this game", but I want to keep them separate so that I can demand-flush them. You could cvar-ize the commit batch size if you wanted; I'm in two minds about that one - it's nice to expose it, but should regular users be able to mess with something fairly low-level like that?
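Just to show the layout (the names here are mine, not the actual code), that pool set might look something like this, with the commit batch shown as a plain variable where the real thing would make it a cvar:

[code]
mempool_t *pool_permanent;   // lives for the whole session
mempool_t *pool_game;        // flushed on game change
mempool_t *pool_cache;       // cached models and sounds; can be demand-flushed
mempool_t *pool_map;         // flushed on map change
mempool_t *pool_edicts;      // edicts for the current map
mempool_t *pool_temp;        // short-lived scratch allocations

int mem_commitbatchkb = 256; // commit batch size in KB; cvar-ize it if you want
[/code]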
Precalculating sizes and allocating in bulk is always good too. For example, you can figure out in advance how many glpoly_t structs you need for non-warp surfaces (and the sizes of their vertex arrays), so why not just allocate a single big buffer at the end of Mod_LoadFaces and write the polys into that, instead of doing a separate allocation for each poly?
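As a rough sketch of that idea (using a cut-down glpoly_t in the GLQuake style; surf_numverts and BuildPolysInOneBuffer are stand-ins for the real msurface_t data and loader code), it's just two passes - one to total up the sizes, one to carve the polys out of a single buffer:

[code]
#include <stddef.h>
#include <stdlib.h>

#define VERTEXSIZE 7

typedef struct glpoly_s
{
    struct glpoly_s *next;
    int numverts;
    float verts[4][VERTEXSIZE];   // really variable-sized: numverts entries
} glpoly_t;

// size of one poly with n verts (same formula the per-poly allocation uses),
// rounded up so each poly stays 8-byte aligned inside the big buffer
static size_t PolySize (int n)
{
    size_t sz = sizeof (glpoly_t) + (n - 4) * VERTEXSIZE * sizeof (float);
    return (sz + 7) & ~(size_t) 7;
}

glpoly_t *BuildPolysInOneBuffer (const int *surf_numverts, int numsurfs)
{
    size_t total = 0;
    unsigned char *buf, *p;
    int i;

    // pass 1: precalculate the total size needed for all non-warp surfaces
    for (i = 0; i < numsurfs; i++)
        total += PolySize (surf_numverts[i]);

    // pass 2: one allocation instead of numsurfs separate ones
    buf = p = (unsigned char *) malloc (total);   // or allocate from your "this map" pool

    for (i = 0; i < numsurfs; i++)
    {
        glpoly_t *poly = (glpoly_t *) p;

        poly->numverts = surf_numverts[i];
        poly->next = NULL;
        // ... fill in poly->verts here the same way the normal per-surface code does ...

        p += PolySize (surf_numverts[i]);
    }

    return (glpoly_t *) buf;
}
[/code]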
You'll notice that I have fixed hard-coded size limits on the pools here, but that's OK. warpc only needs 48 MB from the "this map" pool (or about 85 MB if you don't cache models and sounds), so a 128 MB upper limit is fine. You could increase it to 256 MB or even higher if you want; because it's only reserving an area of virtual address space for itself, there's no actual resource utilization involved in doing so (just make sure that you don't reserve more than 2 GB in total though!)
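For completeness, here's roughly what reserving (but not committing) a pool's full upper limit can look like, under the same Win32 VirtualAlloc assumption as the earlier sketch; Pool_Create is again just an illustrative name, reusing the mempool_t from above:

[code]
#include <windows.h>
#include <stdlib.h>

mempool_t *Pool_Create (int maxsize)
{
    mempool_t *pool = (mempool_t *) malloc (sizeof (mempool_t));

    // MEM_RESERVE only claims address space; no physical memory or pagefile
    // is used until regions of it are committed by the allocator later on
    pool->base = (unsigned char *) VirtualAlloc (NULL, maxsize, MEM_RESERVE, PAGE_NOACCESS);
    pool->maxsize = maxsize;
    pool->lowmark = 0;
    pool->highmark = 0;

    return pool;
}

// e.g. a 128 MB upper limit for the "this map" pool costs nothing up front:
// mempool_t *pool_map = Pool_Create (128 * 1024 * 1024);
[/code]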
Anyway, I'm getting into areas more appropriate for the Engine Coding Tips thread now, so I think I'll leave it at that!