Generalizing Integers (was Re: Specifying int size in C)

Stanley Chow schow at bnr-public.uucp
Mon May 8 12:22:40 AEST 1989


In article <166 at mole-end.UUCP> mat at mole-end.UUCP (Mark A Terribile) writes:
> [...]
>The problem with the PL/I approach is that the programmer is encouraged to
>think of such declarations as ``normal.''  The results are strange conversions
>leading to hard-to-find, hard-to-predict bugs which occur on boundaries that
>cannot be easily guessed at in either inspection, walk-through, or white-box
>testing, and leading also to extra code size and execution costs.
>

It seems to me that the declaration of integer sizes can be separated from the
conversion rules. PL/I happens to have conversion rules that make life very
interesting. That does not mean the *approach* is bad, merely that PL/I did
not get this aspect right.
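
To make the distinction concrete, here is a minimal sketch in C. The typedef
names int16 and int32 are hypothetical, and the widths assumed in the comments
hold only for one particular machine; the point is that the programmer pins
down the width at the point of declaration while C's ordinary conversion
rules are left completely alone:

    /* Hypothetical sized-integer names for one machine only;
     * a real scheme would pick the mapping per target.
     */
    typedef short int16;   /* 16 bits on this machine */
    typedef long  int32;   /* 32 bits on this machine */

    main()
    {
        int16 count = 100;
        int32 total = 0;

        /* The usual C promotions apply here; declaring the
         * widths above introduced no PL/I-style hidden
         * conversions.
         */
        total = total + count;
        return 0;
    }

A language could offer the declaration half of this without inheriting
PL/I's version of the conversion half.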

If you claim that *all* such attempts are doomed to failure, then I take
serious issue with that claim.


>Bringing the complexity of code, seen as a cultural phenomenon, under some
>kind of control required going to a simpler model of computation, required
>going from PL/I to B.  As the limits of the simpler model(s) were discovered,
>they were expanded incrementally, first by going to C and then by extending
>C bit by bit.
>

I suggest that the model may be complex, as long as it is easy to reason
within that model. For example, Turing machines are very simple, but I find
it very difficult to do anything with them. On the other hand, LR(k) grammars
are far from simple, but I can write useful parsers with them.



Stanley Chow        BitNet:  schow at BNR.CA
BNR		    UUCP:    ..!psuvax1!BNR.CA.bitnet!schow
(613) 763-2831		     ..!utgpu!bnr-vpa!bnr-fos!schow%bnr-public
I am just a small cog in a big machine. I don't represent nobody.


