Heap Fragmentation

Mike Shannon mikes at Apple.COM
Thu Sep 28 09:49:28 AEST 1989


In article <1989Sep27.045926.12634 at polyslo.CalPoly.EDU> ttwang at polyslo.CalPoly.EDU (Thomas Wang) writes:
>Is heap fragmentation in C a problem or non-problem?

Memory management is not part of the C language; it is part of the library
support in the underlying system.  The last I heard, there was an effective
scheme for re-using de-allocated memory, based on a form of hashing.  My
understanding is that de-allocated memory chunks are never handed back to
the operating system, but are kept in an array of queues in which the chunk
size doubles from one queue to the next.  So, when you do a malloc(xxx),
the queue holding blocks of at least xxx bytes is searched, a block is
broken in two, and you get part of it.  This means you avoid the overhead
of a system call.
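
Something like the following sketch shows the idea.  The names, the
queue sizes, and the use of sbrk() are my own guesses for illustration,
and it leaves out the splitting of larger blocks; it is not the actual
malloc() source.

    #include <stddef.h>
    #include <stdint.h>
    #include <unistd.h>             /* sbrk() */

    #define MINSHIFT 5              /* smallest block: 32 bytes */
    #define NBUCKETS 16             /* largest block: 1 megabyte */

    union header {
        struct {
            union header *next;     /* free-list link while the block is free */
            int bucket;             /* which queue the block belongs to */
        } s;
        double align;               /* force worst-case alignment */
    };

    static union header *queue[NBUCKETS];

    void *my_malloc(size_t n)
    {
        int i;
        size_t sz;
        union header *p;

        /* find the smallest queue whose blocks can hold n bytes */
        i = 0;
        sz = (size_t)1 << MINSHIFT;
        while (sz - sizeof(union header) < n) {
            if (++i >= NBUCKETS)
                return NULL;        /* too big for this sketch */
            sz <<= 1;
        }
        if ((p = queue[i]) != NULL)
            queue[i] = p->s.next;   /* re-use a freed block: no system call */
        else {
            p = (union header *)sbrk((intptr_t)sz);
            if (p == (union header *)-1)
                return NULL;        /* grow the heap only on a miss */
        }
        p->s.bucket = i;
        return (void *)(p + 1);
    }

    /* freed blocks go back on their queue, never to the system */
    void my_free(void *vp)
    {
        union header *p = (union header *)vp - 1;

        p->s.next = queue[p->s.bucket];
        queue[p->s.bucket] = p;
    }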

	I personally have never had a problem with heap fragmentation under
UNIX.
	For performance reasons, if I am going to be allocating and
de-allocating memory of a single fixed size, I often maintain my own 'free'
list and do my own allocation, checking that free list before calling
malloc().
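
For example (the struct and the names here are invented for
illustration):

    #include <stdlib.h>

    struct node {
        struct node *next;          /* link field, doubles as free-list link */
        int data;
    };

    static struct node *node_freelist;

    struct node *node_alloc(void)
    {
        struct node *p = node_freelist;

        if (p != NULL)
            node_freelist = p->next;    /* re-use: no malloc() call */
        else
            p = (struct node *)malloc(sizeof(struct node));
        return p;
    }

    void node_free(struct node *p)
    {
        p->next = node_freelist;        /* push onto the private list */
        node_freelist = p;
    }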
	In summary, I would say heap fragmentation under UNIX is not a
problem.
-- 
			Michael Shannon {apple!mikes, mikes at apple.com}
