C common practice. (was: low level optimization)

Michael Meissner meissner at osf.org
Wed May 1 00:37:32 AEST 1991


In article <1991Apr29.220139.28983 at cbnewsh.att.com> daw at cbnewsh.att.com (David Wolverton) writes:

| In article <22649 at lanl.gov>, jlg at cochiti.lanl.gov (Jim Giles) writes:
| > One of the assumptions (which is still valid most places) is that
| > no interprocedural analysis is done - even _within_ a file, even 
| > though the language permits such optimizations.  The result is that
| > there really is a tendency to maintain C code as numerous separate
| > files.  ...[stuff deleted]...
| > I don't maintain that keeping code in separate files is necessarily
| > bad or good.  But to pretend that it is not common practice is to 
| > ignore reality.  This common practice may in (the near) future result 
| > in less efficient code because of missed optimization.
| 
| I can't say about "most places", but I _can_ say that we
| built a commercially-available 3B2 C compiler here in 1986
| that did some interprocedural stuff within a source file.
| 
| Our biggest customer was a project with _thousands_ of
| functions, one per file.  We tried mightily to get them
| to group related functions into a single file, but their
| "project methodology" wouldn't allow it.  So they could
| never take advantage of that group of optimizations.

One way to 'solve' this problem is to have one file #include each of
the different .c files.  That way you get fast turnaround for
debugging (recompile only the files that changed, in the usual
fashion), and for production builds you get the interprocedural
optimizations.  Note that compiling everything together typically
requires lots of physical memory to run the compiler.
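As a rough sketch (the file names below are made up for illustration,
not from any real project), the 'everything' file is nothing more than
a list of #includes:

    /*
     * all.c - used only for the optimized production build.
     * Pulling every module into one translation unit lets the
     * compiler apply its interprocedural optimizations across
     * what are normally separate files.
     */
    #include "lexer.c"
    #include "parser.c"
    #include "codegen.c"

For day-to-day debugging you still compile lexer.c, parser.c, etc.
separately and link the objects; for the production build you compile
just all.c (say, cc -O all.c -o prog).  The one catch is that the
separate files must not clash when combined (for instance, two files
each defining their own static 'buf', or a macro left defined that the
next file doesn't expect).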
--
Michael Meissner	email: meissner at osf.org		phone: 617-621-8861
Open Software Foundation, 11 Cambridge Center, Cambridge, MA, 02142

Considering the flames and intolerance, shouldn't USENET be spelled ABUSENET?


