ANSI 'C'.

L.Rosler lr at sftig.UUCP
Fri Nov 22 03:20:24 AEST 1985


> >   The standard has no real business specifying the environment in this
> > amount of detail, or perhaps it should be worded with something like
> > "If you are going to supply XXXX functionality, do it like this...".
> 
> The standard obviously didn't put much thought into anything related to
> times.  They define type "time_t" as an arithmetic type that represents
> the time, and then define a difftime(time_t time1, time_t time2) that
> computes the difference between these two arithmetic values.  Why the
> function?  Does the standard have a new arithmetic type on which the
> operation of subtraction is not allowed?
> And then they define a function gmtime(const time_t *timer) that takes
> a pointer to the arithmetic value, which it is not going to change anyway.
> Why not just take the value itself instead of the pointer?

Several members of the X3J11 Committee are working on a rationale
document that will, I hope, clarify some apparently obscure choices.
It happens that a great deal of thought went into every aspect of
this issue.

The first question was whether methods of determining and reporting
time belonged in the C standard at all, or were properly part of a
system-interface standard such as that being produced by IEEE P1003
(nee /usr/group).  It was decided that the functionality desired
was sufficiently widely available to warrant standardization in the
language.  Provisions were made for environments that didn't have
the capability, by specifying a suitable error return which the
application could check for (the same value as that used by UNIX*,
not by coincidence).
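
To make the check concrete, here is a minimal sketch (mine, not
the Committee's text) of how a portable application can detect
the absence of the capability:

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);

        if (now == (time_t)-1) {   /* the agreed error value */
            fprintf(stderr, "calendar time not available\n");
            return 1;
        }
        printf("the time is %s", ctime(&now));
        return 0;
    }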

The second question was why difftime() needed to be invented
at all.  The answer rests on the fact that although the value
returned by time() had to be
arithmetic (in order that the error value, (time_t)-1, be easily
detected), there was no need to burden implementations by requiring
it to be a linear representation of the time, as it happens to be
on UNIX systems.  It need not even be monotonically increasing,
viewed as an integer!  The question of appropriate units for the
difference of two times was finessed by specifying the result of
the subtraction as a double, catering for resolutions of integer
seconds on slow systems or nanoseconds on CRAYs.
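
By way of illustration only (again a sketch of mine, not draft
text), portable code compares two times solely through difftime():

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t start, finish;

        time(&start);
        /* ... lengthy work ... */
        time(&finish);

        /* difftime() hides the encoding of time_t; writing
           finish - start would assume the linear, UNIX-style
           representation that the draft does not require. */
        printf("elapsed: %.0f seconds\n", difftime(finish, start));
        return 0;
    }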

The third question was why these functions accept a pointer
argument instead of the value itself.  The answer reflects the
encrustation of UNIX archaisms on a
standard of this kind.  In olden times, before C even had a "long"
data type, the time was stored in an array of two 16-bit ints,
hence HAD to be moved around via a pointer.  (Remember that the
name of the array serves as a pointer to the first element.)
This is why time_t time(time_t *timer) can return a value via
a side-effect on an argument.  As a C function cannot return an
array, originally this was the ONLY way time() could produce a
result.  When "long" was introduced, old programs continued to
operate through the type mismatch, provided the order of the
ints in the array was suitable to produce a "long".  The
time() function thus acquired a legitimate return value, and
the side-effect was preserved for compatibility.
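
Both idioms thus remain legal; a sketch of my own:

    #include <time.h>

    void two_styles(void)
    {
        time_t now;

        now = time((time_t *)0);   /* take the return value... */
        (void)time(&now);          /* ...or rely on the archaic side-effect */
    }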

No one on the Committee is PROUD of this kind of specification,
nor would we design such functions that way ab initio.  But one
of the roles of rational standardization is to preserve what is,
not create what should be.

Thanks for your indulgence.  We really do think hard on
occasion.

Larry Rosler, AT&T
Editor, X3J11 C Standards Committee
(201) 522-5086

* UNIX is a trademark of AT&T.


