The curse of (memory) fragmentation

Started by DeadHead, October 08, 2015, 12:43:34 PM


DeadHead

Hi Jeremy,

Read an article about "The curse of fragmentation". Got the link from a Reddit thread where a game programmer talked about crashing issues that can occur on 32-bit systems, which possibly could have been fixed if the game had been built for 64-bit instead.

Any thoughts on the matter from your point of view?
Windows 10 Pro 64 (Swedish) || Xeon 5650 @ +4 GHz || 24 gig ram || R9280 Toxic

edkiefer

Bitsum QA Engineer

DeadHead

Lol, thanks. What the... I forgot the damn link?? Sorry about that, but thanks again edkiefer.

Jeremy Collake

This is more for programmers than users.

Programmers can get caught in situations where large numbers of small blocks of memory are allocated and released in a pattern that leaves free memory scattered in small pieces, so a later request for a larger contiguous block can't be satisfied. Look up 'heap fragmentation' for other good reading on the subject.
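To make that concrete, here is a toy model (not any real heap implementation) of an allocator over a tiny 16-slot arena. It only tracks which slots are in use, but it is enough to show how freed space can be plentiful yet unusable:

```cpp
#include <array>
#include <cstddef>

// Toy model of an allocator over a 16-slot arena: true = slot in use.
// Purely illustrative; a real heap is far more complex.
struct Arena {
    std::array<bool, 16> used{};

    // Find and claim a contiguous run of n free slots; return its start
    // index, or -1 if no free run that long exists (allocation failure).
    int alloc(std::size_t n) {
        std::size_t run = 0;
        for (std::size_t i = 0; i < used.size(); ++i) {
            run = used[i] ? 0 : run + 1;
            if (run == n) {
                std::size_t start = i + 1 - n;
                for (std::size_t j = start; j <= i; ++j) used[j] = true;
                return static_cast<int>(start);
            }
        }
        return -1;
    }

    // Release a previously allocated run of n slots.
    void release(std::size_t start, std::size_t n) {
        for (std::size_t j = start; j < start + n; ++j) used[j] = false;
    }
};
```

Fill the arena with eight 2-slot blocks, then free every other one: half the arena is now free, yet a 3-slot request still fails, because no free run is longer than 2 slots. That is fragmentation in miniature.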

Anyway, unless you are a programmer, it's not an issue you need to even think about.

My biggest fear is that some end user will see this and say, "LOOK! My memory DOES need defragmenting [by '90s-era snake-oil]!", which is not at all what the author is talking about.

First, the problem he speaks of can't be solved 'externally' (so no program can 'fix' it), and second, memory is random access: there is no seek time, so fragmentation is not the same issue as on hard drives. It's superficially similar, but the way it becomes a 'problem' is entirely different.

Basically, as I said, the issue is that allocating and releasing many small blocks of memory later makes it harder to find a contiguous block large enough for a bigger allocation (each allocation must be contiguous, at least in the process's virtual address space).

64-bit processes obviously have access to a vastly larger address space than 32-bit processes, so the problem is greatly mitigated.

As an interesting side note, Process Lasso has always reused its process objects to prevent heap fragmentation, which is what the author recommends.

If I read it further and have anything to add, I'll post more. Thanks for the link.
Software Engineer. Bitsum LLC.

DeadHead

Quote from: Jeremy Collake on October 08, 2015, 01:24:34 PM
First, the problem he speaks of can't be solved 'externally' (so no program can 'fix it'), and second, memory is Random Access, there is no seek time, so fragmentation is not the same issue as in hard drives. It's similar, but just totally different, lol.

Nah, I didn't think of it as fragmentation in the sense it applies to hard drives, but more that the fragmentation they talk about can cause crashes. The dev that linked to the page above said the crashing issues were more likely to happen if the (game) client had been running for a long period of time (the game is Guild Wars 2).
Personally haven't had any issues with the game, just found the discussion interesting! :)

Jeremy Collake

Yep, I figured you got it. I just worry about readers, always thinking about somebody coming in from the outside.

As I said, what he's describing is a classic in computer science. Heap fragmentation, basically. It's something only programmers can deal with as they write the code. For instance, his suggestion of re-using allocations is something Process Lasso has always done. Instead of just deallocating a process object, it'll reuse it. This reduces memory alloc/free overhead, and reduces heap fragmentation.
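A minimal sketch of that reuse pattern, with illustrative names (this is not Process Lasso's actual internals): instead of deleting a per-process record when it's done, park it on a free list and hand it back out for the next new process, so the heap sees far fewer alloc/free cycles.

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Hypothetical per-process bookkeeping record.
struct ProcessRecord {
    unsigned long pid = 0;
    // ... other bookkeeping fields ...
};

// Simple object pool: released records are kept and recycled instead of
// being returned to the heap, reducing allocation churn and fragmentation.
class ProcessRecordPool {
    std::vector<std::unique_ptr<ProcessRecord>> free_;
public:
    std::unique_ptr<ProcessRecord> acquire(unsigned long pid) {
        std::unique_ptr<ProcessRecord> rec;
        if (!free_.empty()) {               // reuse: no new heap allocation
            rec = std::move(free_.back());
            free_.pop_back();
        } else {
            rec = std::make_unique<ProcessRecord>();
        }
        rec->pid = pid;
        return rec;
    }

    void release(std::unique_ptr<ProcessRecord> rec) {
        *rec = ProcessRecord{};             // reset state for the next user
        free_.push_back(std::move(rec));
    }

    std::size_t idle() const { return free_.size(); }
};
```

The trade-off is that pooled objects hold their memory even while idle, which is usually a fine price for steadier long-run behavior in a process that runs for days.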

My fear was just that people will take this article and, not understanding it, use it to say that RAM fragmentation is a 'problem' that can be 'solved' by some snake-oil.

For any readers, Process Lasso's SmartTrim continues to be the best that can be done when it comes to RAM Optimization.