bigger longs (64 bits)

Rick Peralta peralta at pinocchio.Encore.COM
Thu Feb 22 04:41:47 AEST 1990


In article <194 at hico2.UUCP> kak at hico2.UUCP (Kris A. Kugel) writes:
>> >What are the feelings here regarding 64 bit longs?
>
>We are starting to have problems because of the wide variety of
>wordsizes on the machines UNIX runs on.  Does it make sense that
>a long is such a different size on different machines?  What if
>you want a guaranteed precision?  I'm beginning to think that
>some kind of declaration construct like int(need32) var; is needed.

>The layout of structures is another problem;
>Isn't it about time we bit the bullet and decided that the C language
>needs to support types, structures, and ints that look the same from
>one machine to another?

Standardizing makes infinite sense, but is a logistical monster.

Maybe a compiler switch that can fall back to the "old way" (whatever
that is) and defaults to a new standard could be managed.

As for required sizes there is already a mechanism, the bit-field: int x:32;
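Only inside a structure, mind you, and ANSI won't take a width bigger
than an int, so treat this as a rough sketch rather than a portable
fix (the struct and field names are made up for illustration):

    struct fixed_width {
        int          val   : 32;    /* exactly 32 bits requested -- only
                                       legal where int is 32 bits or more */
        unsigned int flags : 8;     /* small fields pack together         */
    };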

Byte ordering is a real issue.  Casting or declaring a type to have a
particular byte order seems wonderful, until you look at what it does
to the compiler folks.  They would have to convert every data item's
byte order for each operation.  (How about: 1234 int x; (1234) x++;
(4321) x--;)
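
To see what that costs, here is roughly the per-access fixup a compiler
would have to emit for a "wrong order" 32-bit item.  A sketch of the
idea only -- the name swap32 and the use of unsigned long as a 32-bit
container are my own assumptions:

    /* swap the four bytes of a 32-bit value */
    unsigned long swap32(unsigned long x)
    {
        return ((x & 0x000000ffUL) << 24) |
               ((x & 0x0000ff00UL) <<  8) |
               ((x & 0x00ff0000UL) >>  8) |
               ((x & 0xff000000UL) >> 24);
    }

    /* so "(4321) x++" really means: x = swap32(swap32(x) + 1); */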

Since we're getting the compiler people excited, why not have some fun...
Why not have the compiler manage math sizes beyond what the current
hardware supports?  For example: a 16-bit machine with 32- or 64-bit
math.  If the hardware is inadequate, just call a library routine or
inline the code.  That way math code would no longer be functionally
limited by the hardware.
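
For instance, 64-bit addition on a 32-bit machine boils down to
something like the routine below, which a compiler could call or
inline.  Again a sketch only -- the struct layout and names are mine,
and it assumes unsigned long holds at least 32 bits:

    /* a 64-bit value carried as two 32-bit halves */
    struct long64 { unsigned long hi, lo; };

    struct long64 add64(struct long64 a, struct long64 b)
    {
        struct long64 r;

        r.lo = (a.lo + b.lo) & 0xffffffffUL;        /* low half, wrapped */
        r.hi = (a.hi + b.hi + (r.lo < a.lo))        /* propagate carry   */
               & 0xffffffffUL;
        return r;
    }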


 - Rick "But it should be put on the standards list..."
