Heap Fragmentation

Tom Stockfisch tps at chem.ucsd.edu
Sat Sep 30 15:55:11 AEST 1989


In article <11161 at smoke.BRL.MIL> gwyn at brl.arpa (Doug Gwyn) writes:

>>In article <1989Sep27.045926.12634 at polyslo.CalPoly.EDU> ttwang at polyslo.CalPoly.EDU (Thomas Wang) writes:
>>>Is heap fragmentation in C a problem or non-problem?

>>In summary, I would say heap fragmentation under UNIX is not a problem.

>It can be, but unless you're pushing the limits on maximum virtual
>data space size it would take a rather unusual memory allocator usage
>pattern.

I would disagree, at least with the Berkeley malloc package.  It can easily
waste a factor of 4 or 5.  The following excerpt is certainly not
unusual, yet malloc gets 45984 bytes from the operating system to
satisfy an 8192-byte need:


	#include <stdlib.h>

	int	size = 1;
	char	*p = malloc(size);

	/* double the block up to 8192 bytes; each request that
	   crosses a power-of-two bucket boundary forces a fresh,
	   larger allocation */
	while (size <= 8192)
	{
		p = realloc(p, size);
		size *= 2;
	}
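
If you want to check the figure on your own system, here is a minimal
sketch that brackets the loop with sbrk(0) calls and prints how far
the break moves.  It assumes an allocator that takes all of its memory
from sbrk, as the Berkeley package does; the exact total will vary
with the malloc you have:

	#include <stdio.h>
	#include <stdlib.h>
	#include <unistd.h>

	int
	main()
	{
		char	*start = sbrk(0);	/* break before allocating */
		int	size = 1;
		char	*p = malloc(size);

		while (size <= 8192)
		{
			p = realloc(p, size);
			size *= 2;
		}
		printf("malloc took %ld bytes from the OS\n",
			(long)((char *)sbrk(0) - start));
		return 0;
	}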

I find that my programs that need a significant amount of memory
generally consume 5-8 times as much as they theoretically need.

The Berkeley malloc keeps, essentially, a free list for each request
size: it adds 8 bytes of overhead to every request, then rounds the
total up to the nearest power of two, with an 8-byte minimum.
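
In code, the bucket computation just described looks roughly like
this (an illustration of the scheme, not the actual Berkeley source):

	/* bytes actually consumed by a request of n bytes: add the
	   8 bytes of overhead, then round up to a power of two,
	   never going below 8 */
	unsigned long
	bucket(unsigned long n)
	{
		unsigned long	b = 8;

		while (b < n + 8)
			b *= 2;
		return b;
	}

So a 1-byte request consumes 16 bytes, and the final realloc to 8192
in the excerpt above lands in a 16384-byte bucket.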

The "worst case" behavior for this,
assuming an actual need of MAX, requires roughly

        MAX*(log (MAX) - 2)
                2

For instance, a program that needed only 8 meg might have malloc
asking for 320 meg from the operating system !!
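
To see where a bound like that comes from, here is a sketch of the
worst-case pattern: fill memory with blocks of one size class, free
them all, and move up to the next class.  With a free list per size,
none of the freed blocks can serve the next, larger class, so the
break only grows.  The 64K figure is just for illustration, and the
sbrk measurement again assumes an allocator like the one described:

	#include <stdio.h>
	#include <stdlib.h>
	#include <unistd.h>

	#define NEED	(64 * 1024)	/* stand-in for MAX */

	int
	main()
	{
		char		*start = sbrk(0);
		static char	*blocks[NEED / 8];
		long		size, i, n;

		/* for each size class in turn, allocate NEED bytes
		   worth of blocks, then free them all; the freed
		   blocks sit on that class's free list and are never
		   reused by the next, larger class */
		for (size = 8; size <= NEED; size *= 2)
		{
			n = NEED / size;
			for (i = 0; i < n; i++)
				blocks[i] = malloc(size);
			for (i = 0; i < n; i++)
				free(blocks[i]);
		}
		printf("needed %d bytes at a time, took %ld from the OS\n",
			NEED, (long)((char *)sbrk(0) - start));
		return 0;
	}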
-- 

|| Tom Stockfisch, UCSD Chemistry	tps at chem.ucsd.edu


