Low-level optimization

Richard Caley rjc at cstr.ed.ac.uk
Thu Apr 25 23:45:01 AEST 1991


In article <22355 at lanl.gov>, Jim Giles (jg) writes:

jg> In article <RJC.91Apr20023816 at brodie.cstr.ed.ac.uk>, rjc at cstr.ed.ac.uk (Richard Caley) writes:
jg> [...]          the only third way is if 'translation' does not
jg> include code generation (which will subsequently be done by some
jg> post-translation tool: like the loader - just like I keep saying).

rjc> When compiling `together' the compiler creates object files
rjc> containing two complete sets of code, data etc. The linker looks at
rjc> the object files given and if they were not all compiled together (eg
rjc> if you had recompiled one) it uses the non-optimised code.

jg> Same as my third option.  

No, you said the `linker' would have to do code analysis and code
generation; I said it would have to make a choice. If the difference
is not clear, I suggest you try to code them. How big do you think
interprocedural analysis plus code generation is? I think it is clear
that choosing one of two sets of definitions takes about 100 lines,
including totally paranoid levels of defensive tests.

jg> The load-time tool actually does the interprocedural analysis.

Nope; it doesn't even need to know that the difference between the
alternate definitions _is_ the result of interprocedural analysis.

jg> Your solution merely makes the loader's
jg> 'code generation' duties simpler at the expense of more expensive
jg> compilation and larger object files.  And, again, no implementation
jg> does this (yet).

Larger objects I'll grant you; I never said it was a commercially
viable solution, just the simplest to explain in a news message.

PS. Re: `common practices'. The only C code I have ever seen written
	one procedure to a file was the result of letting a PL/I and
	PL6 person loose with CC. Name deleted to protect the not-so-
	innocent :-)

--
rjc at cstr.ed.ac.uk			_O_
					 |<



More information about the Comp.lang.c mailing list