Just rambling about optimization...

Chris Torek torek at elf.ee.lbl.gov
Sat May 11 23:56:46 AEST 1991


In article <453 at smds.UUCP> rh at smds.UUCP (Richard Harter) writes:
>The point is that optimization of programs frequently involves better
>design of the data management process.

Indeed.  It has been observed (sorry, no references, but this is not
only popularly known but also true :-) ) that many computers spend most
of their time copying data, rather than `computing'.  This is largely
why a fast bcopy/memcpy/memmove is important.  A fast block copy is
a wonderful thing, but better yet is to eliminate the copies altogether.
If the copies are unavoidable, it may pay to compute while copying,
rather than having the sequence:

	memmove(saved_data, data, len);
	... work over the data ...

---particularly if `working over the data' involves a linear pass over
all the items.  (TCP is a good example of this: it must copy outbound
data somewhere for retransmission, and it must also compute a checksum.
These operations can be done at the same time.)
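As a sketch of the idea (the function name is made up, and the 16-bit
ones'-complement sum below is in the style of the Internet checksum,
not a copy of any real TCP implementation): one loop both copies the
bytes and accumulates the sum, so the data is traversed only once.

```c
#include <stddef.h>
#include <stdint.h>

/*
 * Copy len bytes from src to dst while accumulating a 16-bit
 * ones'-complement sum over the same bytes, then return the
 * complemented sum.  One pass over the data instead of two.
 */
uint16_t
copy_and_checksum(void *dst, const void *src, size_t len)
{
	const unsigned char *s = src;
	unsigned char *d = dst;
	uint32_t sum = 0;
	size_t i;

	for (i = 0; i + 1 < len; i += 2) {
		d[i] = s[i];
		d[i + 1] = s[i + 1];
		sum += (uint32_t)s[i] << 8 | s[i + 1];
	}
	if (i < len) {		/* odd trailing byte: pad with zero */
		d[i] = s[i];
		sum += (uint32_t)s[i] << 8;
	}
	while (sum >> 16)	/* fold carries back into low 16 bits */
		sum = (sum & 0xffff) + (sum >> 16);
	return (uint16_t)~sum;
}
```

On many machines the copy is memory-bound anyway, so the adds come
nearly for free; the win is skipping the second pass over the data.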

Of course, when optimizing, the place to start is with a profiler.
-- 
In-Real-Life: Chris Torek, Lawrence Berkeley Lab CSE/EE (+1 415 486 5427)
Berkeley, CA		Domain:	torek at ee.lbl.gov
