short circuit evaluation - some comments

drw at cullvax.UUCP
Fri Feb 20 02:02:21 AEST 1987


mckeeman at wanginst.EDU (William McKeeman) writes:
> The compiler writer is surely allowed any optimization that will not
> change the program results.  Some will work harder on this than others
> but the programmer should not care.  The question arises when an unlikely
> set of circumstances would change the results (like aborting on overflow).
> This is a hard problem because the compiler writer cannot know what the
> programmer is looking for.  If, for instance, run-time is a legitimate
> "result", then no optimization can be allowed at all.

Well, it depends on how much screwing around with operations the
compiler is permitted to do by the language definition.  In C, the
compiler is allowed to evaluate (a+b)+c by computing a+(b+c).  If b+c
may overflow where a+b doesn't, that regrouping changes the result, so
for portable code the programmer must write +(a+b)+c to force the
grouping.  I know that Fortran has careful specifications as to what
the compiler is permitted to do along these lines.
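
To make the hazard concrete, here's a sketch (the values are made up
for illustration, and I'm assuming an ANSI-style <limits.h>; the
point holds for 16-bit and 32-bit ints alike):

	#include <limits.h>
	#include <stdio.h>

	int main(void)
	{
		int a = -1000;
		int b = INT_MAX - 500;
		int c = 600;

		/* As written, (a + b) + c: a + b pulls the running sum
		 * well below INT_MAX, so no intermediate overflows. */
		int as_written = (a + b) + c;

		/* Regrouped as a + (b + c): b + c exceeds INT_MAX, so a
		 * compiler that regroups has introduced an overflow the
		 * programmer never wrote. */
		/* int regrouped = a + (b + c); */

		printf("%d\n", as_written);
		return 0;
	}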

As for run-time, it is not a legitimate result, because the
standards (implicitly) state that run-time is "implementation
defined".  (Flame on:) How can it possibly be a legitimate result???
It changes if you go to a faster processor!!!  (Otherwise why would
you get a faster processor???)  Which one of us is crazy???  (Flame
off.)

The idea I'm trying to get across is that "what the programmer is
looking for" is "what the programmer says", and that is "what the
language definition says it means".  If it isn't clear from the
language definition what *precisely* some construction means (and
*precisely* what that construction leaves unspecified), the language
definition is deficient.

> The safest approach is to warn the programmer when the compiler is about to
> throw away some of his/her carefully crafted C.  A value that is computed,
> and not used is a significant programming error, not an optimization
> opportunity.

It's not an error, because the program is valid.  It is *probably* a
programmer mistake, though, and the programmer should be warned.  But
the program should execute as specified.  (Error:  a portion of a
program which violates the language definition.  Mistake:
something that the programmer did that he didn't intend to do.)

> With that approach, the following fragment would generate
> a single store of 0 to x and several warning messages about discarded code
> for the visible C commands  /, +, *, *p and ++.
> 
> x = 0; x /= 1; x += 0 * *p++;

How about discarding the += also?
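
For what it's worth, here's a sketch of the reduction I have in mind
(not a claim about what any particular compiler emits): x /= 1 and
x += 0 leave x alone, and 0 times anything is 0, so all that has to
survive is the store of 0 and the increment of p hiding inside *p++.

	#include <stdio.h>

	int main(void)
	{
		int data[2] = { 7, 8 };
		int *p = data;
		int x;

		/* The quoted fragment, verbatim: */
		x = 0; x /= 1; x += 0 * *p++;

		/* What a compiler could legitimately reduce it to:
		 *
		 *	x = 0;	the one store that matters
		 *	p++;	the side effect of *p++ must be kept
		 */

		printf("x = %d, p moved %d element(s)\n", x, (int)(p - data));
		return 0;
	}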

> I regard this approach as an indication that the compiler writer respects the
> users of their product; the programmer would not write code unless the
> programmer expected it to matter.  A numerical analyst once complained
> bitterly that my throwing away the assignment
> X = X + 0.0
> prevented him from normalizing floating point numbers on the IBM 7090.

The choice of internal representation of a quantity should be
entirely up to the compiler.  Thus, whether a number is normalized or
unnormalized (or represented as strings of ASCII characters) is
something that the programmer should (a) have no control over, and (b)
have no concern with.  (Other than I/O to/from other subsystems.)
It's precisely this sort of hardware hacking in high-level languages
that makes programs and languages totally nonportable.  At this rate,
we might as well argue that

	#define	fabs(x)	(x & 0x7FFFFFFF)

should work on all machines.
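
(As written, that macro won't even compile if x is a double, since &
wants integral operands; it only "works" when x is an integer that
happens to hold the bit pattern.)  The portable way to get the same
effect is to let the library worry about the representation.  A
sketch, using the standard <math.h> fabs() instead of the macro:

	#include <math.h>
	#include <stdio.h>

	int main(void)
	{
		double x = -3.5;

		/* fabs() knows the machine's floating point format; the
		 * bit-mask macro only works where the programmer's guess
		 * about the sign bit happens to be right. */
		printf("%g\n", fabs(x));
		return 0;
	}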

> The invisible side effects are more difficult to deal with at the language
> definition level.  I often hear complaints, backed by debugger output, that
> some assignment never happens.  True enough, the value is held in a register
> for use until context exit, and never ends up in memory.  There is no
> consequence except for a bedeviled programmer trying to find a bug with
> optimizer-obscured clues.  In this case the program is not being looked at as
> an input/output mapping, but rather as a living thing to be examined in vivo.
> 
> A solution to both problems is some sort of construct in C that says "from
> here to here, do everything by the abstract machine rules".

It's called "turn off the optimizer"!  Either that, or have a debugger
that's clever enough to peel apart the optimizations and present
what's going on as if the abstract rules had been followed.
(Admittedly, that's hard.)
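
One construct along those lines is the draft ANSI "volatile"
qualifier.  A sketch, assuming a compiler that implements it: every
access to a volatile object must happen exactly as the abstract
machine says, so the store the debugger is looking for really lands
in memory.

	#include <stdio.h>

	volatile int status;	/* accesses may not be cached or deleted */

	int main(void)
	{
		status = 1;	/* this store must actually happen...     */
		status = 2;	/* ...and may not be folded into this one */
		printf("%d\n", status);
		return 0;
	}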

Dale
-- 
Dale Worley		Cullinet Software
UUCP: ...!seismo!harvard!mit-eddie!cullvax!drw
ARPA: cullvax!drw at eddie.mit.edu


