Memory Models

bill davidsen davidsen at sungod.crd.ge.com
Sat Aug 12 01:04:55 AEST 1989


In article <562 at dcscg1.UUCP> drezac at dcscg1.UUCP (Duane L. Rezac) writes:

| I am just getting into C and have a question on Memory Models. I have not
| seen a clear explanation on just what they are and how to determine which 
| one to use. Does anyone have a short, clear explanation of these for  
| someone just starting out in C?  

  I'll provide some information, but bear in mind that memory models are
a characteristic of the linker and of the segmented architecture, rather
than of C itself. Segmented machines can support these models in any
language, including assembler.

The question is whether the code and/or the data space is limited to 64k.
Here's a table of the common models:

                       code
               64k             >64k
         _________________________________
        |                |                |
d  64k  |     small      |     medium     |
a       |________________|________________|
t       |                |                |
a >64k  |    compact     |     large      |
        |________________|________________|

  Two other models are tiny, in which code and data share a single 64k
segment, and huge, in which individual arrays and aggregate objects may
be larger than 64k.
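  If it helps to see the difference in source, here is a rough sketch
using the non-standard near/far/huge keywords that DOS compilers such as
Microsoft C and Turbo C provide. It only compiles on such a compiler;
the sizes are what those compilers report, not anything portable C
promises:

#include <stdio.h>

int main(void)
{
    /* near: 16-bit offset into the default data segment             */
    printf("near pointer: %u bytes\n", (unsigned)sizeof(char near *));
    /* far:  full segment:offset pair, object still limited to 64k   */
    printf("far  pointer: %u bytes\n", (unsigned)sizeof(char far *));
    /* huge: same size as far, but pointer arithmetic is normalized  */
    /* so a single object may cross a 64k segment boundary           */
    printf("huge pointer: %u bytes\n", (unsigned)sizeof(char huge *));
    return 0;
}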

  The reason for using the smaller models is performance. Data access is
faster in the small and medium models because a data pointer is just a
16-bit offset into the default data segment; in the compact and large
models every data pointer also carries a segment, and dereferencing it
means loading a segment register first.
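  To see where that cost comes from, here is a minimal sketch; the
function names are mine, and the remarks about generated code describe
what a typical 8086 DOS compiler emits, not any particular product:

long sum_near(int *p, int n)     /* small model: p is a 16-bit offset    */
{
    long s = 0;
    while (n-- > 0)
        s += *p++;               /* fetch via the offset, DS is implied  */
    return s;
}

long sum_far(int far *p, int n)  /* far pointer: 32-bit segment:offset   */
{
    long s = 0;
    while (n-- > 0)
        s += *p++;               /* typically loads segment and offset   */
                                 /* (e.g. les bx) before every fetch     */
    return s;
}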
	bill davidsen		(davidsen at crdos1.crd.GE.COM)
  {uunet | philabs}!crdgw1!crdos1!davidsen
"Stupidity, like virtue, is its own reward" -me


